Category Archives: « finishing the rat race »

final thoughts on ‘finishing the rat race’

“I’m a pessimist because of intelligence, but an optimist because of will.”
— Antonio Gramsci

It was summer 2006 when I last properly travelled. Disembarking in Athens, and then a few months later in Beijing and Mumbai, I spent a few summer months in three very different countries. Once great civilisations of the ancient world, all were now facing momentous turnabouts. One was about to enter a shattering era of decline (not that this was evident seven years ago) while conversely the others were at the start of an historic upturn: a pair of tigers recovering their strength after a long slumber, readied for a new ascendancy on the world stage. So one thought I’d carried with me was this: which of the two would make for the least objectionable future superpower? It was not a happy question, of course, since like many people I’d rather there were no superpowers, but we also have to be realistic.

My visits to these extraordinary countries of the East had been an unforgettable experience. We had journeyed across landscapes of inexpressible beauty, and visited some of the world’s most ancient temples, palaces and mausoleums. I’d eaten often strange but mostly very delicious food, and delighted in so many other oddities of two distinctive and complicated cultures. For all these positives, however, both stays had also perturbed me greatly and in unexpected ways.

I had, for instance, fully anticipated that China, being a one-party and (notionally at least) Communist state, would be quite evidently so, with a highly visible police and military presence, and a population fearful that careless words might lead to sudden arrest and ‘re-education’ behind the razor-wire of some distant internment camp. Thousands of Chinese dissidents are indeed dealt with by such brutal tactics 1, as the Chinese people were well aware. Not that China is alone in operating secret or semi-secret detention centres, aka “black sites”, for those who are in effect political prisoners. Gitmo at Guantánamo Bay bears the proud motto “honor bound to protect freedom”: ostensibly protecting American freedom by incarcerating without trial, in violation of human rights and international law, those the authorities deem a threat. In China, I was occasionally informed, all those interned at the ‘re-education camps’ are extremist members of a dangerous religion called Falun Gong. For Falun Gong we might read “terror suspect”. And doubtless a few of those who guard the Chinese black sites feel “honour bound” too.

The Chinese state is certainly repressive but not overtly so. Greatly to my surprise, my host and his friends spoke freely not just in private but (more surprisingly) in public too, our conversations regularly straying off into the perilous waters of politics, economics, and the rights and wrongs of Chairman Mao. One evening I also spoke with this little group of friends about (what we call) the massacre at Tiananmen Square, and though all were too young to remember the events personally, each agreed that the story reported in the West was a distortion. The students had attacked the army first, they told me in turn, and the soldiers were forced to defend themselves. To underline this point it was my host who reminded me of “the tank man”: that incredibly brave soul who directly confronted an entire column of tanks of the People’s Liberation Army. Apparently the Chinese had watched the same footage (although not in its entirety, I presume). The soldiers were just trying to go around him, my host explained, with the others nodding agreement… but then, as we know, half truth is untruth.

As in the West, the Chinese bourgeoisie seemed unduly trusting of their government, reluctant to protest against the excesses of their own authorities not principally out of fear, but more straightforwardly because their own lives are rather comfortable and contented. The government enjoys eudaimonistic legitimacy: conditions in China being very good, historically considered, and certainly for those lucky enough to move within the comparatively affluent circles of the Chinese middle classes – and my friends’ families were all within the lower echelons of that circle. The extremes are invisible: the hardships of the sweatshop workers and the worst of the slums hidden away; the heavily polluted industrial centres also well off the tourist trails; and regions where dissent is most concentrated, such as Tibet, strictly off-limits to nearly everyone.

The big giveaway came only after I’d arrived at the border, crossing from the mainland into Hong Kong and suddenly held up by long queues at the checkpoints. It was here that I spoke with an English couple who were leaving after a commercial visit to the nearby electronics factories. They told me they were both delighted to be leaving, completely dismayed by what they had witnessed: fourteen-year-old girls on production lines working sixteen hours for ten dollars a day. When I asked why the British firm they represented didn’t buy their components more locally, they shook their heads and told me that it was impossible to compete. And the queues at the checkpoints? Necessary precautions to hold back a flood of Chinese refugees desperate to join us.

India was a totally different story. In India the privation and misery is never very far beyond the hotel door. It is ubiquitous. So the most deeply shocking revelation about India (a revelation for me at least) was how an upwardly mobile and already affluent few are able to look right past the everyday squalor, unmindful as much as apathetic to its overwhelming ugliness and stench.

If I may briefly compare India to Tanzania, the immediate difference was an alarming one. For modern India is, in countless ways, a comparatively wealthy nation with a growing middle class, a great many of whom are already earning considerably more money than I ever will, whereas Tanzania remains one of the poorest nations on earth. Yet, leaving aside the similarities in terms of the obvious lack of infrastructural investment (bizarre enough given the gaping economic disparity), there was, at least as I perceived it, a greater level of equality in Tanzania: an equality which made the abject poverty appear less shocking (after a while at least) if no less degrading. So India sickened me in a way that Tanzania had not, remaining as she does more ‘Third World’ than one of the poorest and most ‘underdeveloped’ nations on earth.2

The overriding lessons from these journeys were therefore twofold. As a traveller to China, I had been greeted and treated quite differently to those who visited the former Eastern Bloc countries. No doubt thousands of undercover spies exist, but in general this modern Chinese totalitarianism is slicker and more quietly efficient: the cogs of a police state meshing and moving but barely visible and mostly unheard. So China revealed how authoritarian rule can be installed and maintained with comparatively little in the way of outward signs. For instance, I saw fewer CCTV cameras in Tiananmen Square than I would have expected to find in Trafalgar Square.3 Whilst on our many journeys across the country, we encountered no road blocks or random checkpoints. Indeed, my entry into China had been far easier than my departure from Heathrow. The reason behind this is as clear (at least on reflection) as it is deeply troubling: that, as Orwell correctly foresaw in Nineteen Eighty-Four, any forward-thinking police state must sooner or later aim to abolish thoughtcrime altogether.

From India, the important lesson had been much plainer, and my thoughts were firmed up after a conversation with an Italian stranger on our flight home. “We must never let this happen to our own countries,” he told me solemnly, almost as if aware in advance of the impending financial attack which is now impoverishing our own continent.

On returning, I decided to start work on a book. Not about the journeys themselves, but less directly inspired by them.

Seven years on and the future does not look especially prosperous for those in the East or the West. But it does appear that there is a convergence of sorts. The worst elements from modern China and India coming West, and, in exchange, the worst elements of our broken western socioeconomic systems continuing to be exported far and wide. Simultaneously, however, the desire for major political change is now arising in many nations. So broadly in the book, I challenge the direction the world is heading, looking forward to times in which people East and West might choose to reconfigure their societies to make them fit our real human needs much better.

In brief, the book tackles a range of interrelated subjects: from education and debt (closely linked these days), to advertising and mental well-being (linked in another way, as I hope to show), to employment practices and monetary systems – these issues are covered in Part 2. Part 1, meanwhile, considers the larger questions of how we view our own species, its relationship to other species, and to Nature more broadly. A quest for answers which includes a different, but closely related, question: what do science and religion have to tell us about this blooming, buzzing confusion and our place within it?4

The book is entitled finishing the rat race, since this is not merely desirable but, given the political will, and driven by the careful but rapid development and application of new technologies, certainly an achievable goal for every nation in the twenty-first century. It would mean, of course, a second Enlightenment, and unlike the first, one that blossoms over the whole world. Before this can happen, however, we collectively must grasp how perilous the political situation has become, whilst reminding ourselves always that the darkest hour is before the dawn.

1“China is thought to have the highest number of political prisoners of any country in the world. Human rights activists counted 742 arrests in 2007 alone. More recent estimates have put the number between 2,000 and 3,000. There is no way of knowing the total behind bars for “endangering state security” – the charge which in 1997 replaced “counter-revolution” in the Communist criminal code.”

From an article entitled “A welcome move, but thousands remain political prisoners” written by Paul Vallely, published by The Independent on June 23, 2011.

2 I make these statements on the basis of what I witnessed first-hand. There is however a purely quantitative method for comparing relative economic inequality, known as the Gini index, based upon a remarkably simple and elegant formula generating a single number ranging from 0 (for perfect equality) to 100 (for perfect inequality, i.e., all the income going to a single individual). What is not so straightforward, however, is precisely how the statistics are determined for each of the different countries. So instead of one Gini index you will find (if you decide to look) that there are a number of alternative ones: the two main ones being produced by the World Bank and the CIA. But is either of these a truly reliable indicator, using figures independently arrived at irrespective of any political motivations? Given the organisations involved we surely have good reasons to be doubtful. And so what does it really tell us, then, when we learn that according to the CIA, at least, India is one place ahead of Tanzania and two places ahead of Japan? You can find the full CIA rankings at this link: Note that the date of the information varies considerably from country to country.

Likewise, what are we to judge when the World Bank indicator provides figures for India (33.9) and China (42.1) but offers no figures for Tanzania or Japan (to continue the comparisons from above)?

You can find the full World Bank ratings at this link:

3“Britain has one and a half times as many surveillance cameras as communist China, despite having a fraction of its population, shocking figures revealed yesterday.

There are 4.2 million closed circuit TV cameras here, one for every 14 people.

But in police state China, which has a population of 1.3 billion, there are just 2.75 million cameras, the equivalent of one for every 472,000 of its citizens.”

From an article entitled “Revealed: Big Brother Britain has more CCTV cameras than China” written by Tom Kelly, published in The Daily Mail on August 11, 2009.

“With more than 100 such devices [i.e., CCTV cameras] Shetland (population 23,000) has more surveillance cameras than the entire San Francisco police department, which has just 71 CCTVs to cater for a population of 809,000.

So is Shetland an extreme one-off example? Hardly. The UK not only has more CCTV cameras than the world’s biggest dictatorship, China, we also have more cameras per person than anywhere else on the planet.”

From an article entitled “CCTV Britain: Why are we the most spied on country in the world?” written by Fergus Kelly, published in The Express on December 4, 2010.

Both of these articles were published a few years ago, whereas I, of course, had visited China seven years ago. There is plenty of evidence and every reason to suppose that mass surveillance will have increased there too.

4 I have stolen the phrase here from William James’ famous remark about how “The baby, assailed by eyes, ears, nose, skin, and entrails at once, feels it all as one great blooming, buzzing confusion.” From The Principles of Psychology, ch.13, “Discrimination and Comparison”.

the life lepidopteran

The following article is an Interlude between Parts I and II of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from the category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.


“Once upon a time, I, Chuang Chou, dreamt I was a butterfly, fluttering hither and thither, to all intents and purposes a butterfly. I was conscious only of my happiness as a butterfly, unaware that I was Chou. Soon I awaked, and there I was, veritably myself again. Now I do not know whether I was then a man dreaming I was a butterfly, or whether I am now a butterfly, dreaming I am a man.”

— Chuang Tzu 1


Before proceeding further, I’d like to tell a joke:

A man walks into a doctor’s.

“Doctor, Doctor, I keep thinking I’m a moth,” the man says.

The doctor gives him a serious look. “Sorry, but I am not strictly qualified to help you” he replies, rubbing his chin earnestly before adding after a momentary pause, “You really need to see a psychiatrist.”

“Yes,” says the man, “but your light was on.”


There can be no doubting that each of us acts to a considerable extent in accordance with mental processes that lie largely beyond, and are often alien to, our immediate conscious awareness and understanding. For instance, in general we draw breath without the least consideration, or raise an arm, perhaps to scratch ourselves, with scarcely a thought and zero comprehension of how we actually moved our hand and fingers to accomplish the act. And this everyday fact becomes more startling once we consider how even complex movements and sophisticated patterns of behaviour seem to originate without full conscious direction or awareness.

Consider walking, for instance. After admittedly painstaking practice as infants, we soon become able to walk without ever thinking to swing our legs. Likewise, if we have learnt to drive, eventually we are able to manoeuvre a large vehicle with hardly more conscious effort than we apply to walking. The same is true for most daily tasks, which are performed no less thoughtlessly and which, in spite of their intricacies, we often find boring and mundane. For instance, those who have been smokers may be able to perform the rather complicated act of rolling a cigarette without pausing from conversation. Indeed, deep contemplation will probably leave us more bewildered than anything by the mysterious coordinated manipulation of all eight fingers and opposing thumbs.

Stranger still is that our ordinary conversational speech proceeds before we have formed the fully conscious intent to utter our actual words! When I first heard this claim, it struck me as so unsettling that I automatically rejected it outright in what perhaps ought to be called a tongue-jerk reaction. (Not long afterwards I was drunk enough to stop worrying about the latent implications!) For considered dispassionately, it is self-evident that there simply isn’t sufficient time to construct each and every utterance consciously and in advance of the act of speaking; so our vocal ejaculations (as they once were unashamedly called) are just that – they are thrown out! Still further proof is provided by instances when gestures or words emerge in direct conflict with our expressed beliefs and ideas. Those embarrassing occasions when we blurt out what we know must never be spoken we call Freudian slips (and more on Freud below).

More positively, and especially when we enter ‘the zone’, each of us is able to accomplish complex physical acts – for instance, throwing, catching, or kicking a ball – before any conscious thought arises to do so. Indeed, anyone who has played a sport long enough will recall joyous moments when they have marvelled not only at their own impossible spontaneity, but also at the accompanying accuracy, deftness, nimbleness, and on very rare occasions even enhanced physical strength. Likewise, urges, feelings, fears and sometimes the most profound insights can suddenly spring forth into “the back of our minds”, as if from nowhere. As a consequence, this apparent nowhere was eventually given a name. It has come to be known as “the preconscious”, “the subconscious” and more latterly, “the unconscious”.

What this means, of course, is that “I” am not what I ordinarily think I am, but in actuality a lesser aspect of a greater being who enjoys remarkable and considerable talents and abilities beyond “my own” since they lie outside “my” immediate grasp. In this way, we all have hidden depths that ought to give rise to astonishment, although for peculiar reasons of pride, instead we tend to feign ignorance of this everyday fact.


The person most popularly associated with the study of the human unconscious is Sigmund Freud, a pioneer in the field but by no means a discoverer. In fact the philosopher and all-round genius Gottfried Leibniz has a prior claim to the discovery, having suggested that our conscious awareness may be influenced by “insensible stimuli”, which he called petites perceptions 2; while another giant of German philosophy, Immanuel Kant, subsequently proposed the existence of ideas lurking within us of which we are not fully aware, while admitting the apparent contradiction inherent in such a conjecture:

“To have ideas, and yet not be conscious of them, — there seems to be a contradiction in that; for how can we know that we have them, if we are not conscious of them? Nevertheless, we may become aware indirectly that we have an idea, although we be not directly cognizant of the same.” 3

Nor is it the case that Freud was first in attempting any kind of formal analysis of the make-up and workings of the human psyche. Already in 1890, William James had published his ground-breaking work The Principles of Psychology, and though James was keen to explore and outline his principles for human psychology by “the description and explanation of states of consciousness”, rather than to plunge more deeply into the unknown, he was also fully aware of the potentiality of unconscious forces and made clear that any “‘explanation’ [of consciousness] must of course include the study of their causes, conditions and immediate consequences, so far as these can be ascertained.” 4


William James’ own story is both interesting and instructive. As a young man he had been somewhat at a loss to decide what to do with himself. Having briefly trained as an artist, he quickly realised that he’d never be good enough and became disillusioned with the idea, declaring that “there is nothing on earth more deplorable than a bad artist”. He afterwards retrained in chemistry, enrolling at Harvard in 1861 (a few months after the outbreak of the American Civil War), but, restless again, twelve months or so later transferred to biology. Still only twenty-one, James already felt that he was running out of options, writing in a letter to his cousin:

“I have four alternatives: Natural History, Medicine, Printing, Beggary. Much may be said in favour of each. I have named them in the ascending order of their pecuniary invitingness. After all, the great problem of life seems to be how to keep body and soul together, and I have to consider lucre. To study natural science, I know I should like, but the prospect of supporting a family on $600 a year is not one of those rosy dreams of the future with which the young are said to be haunted. Medicine would pay, and I should still be dealing with subjects which interest me – but how much drudgery and of what an unpleasant kind is there!”

Three years on, James entered the Harvard Medical School, where he quickly became disillusioned again. Certain that he no longer wished to become a practicing doctor, and being more interested in psychology and natural history than medicine, he seized a new opportunity and set sail for the Amazon in hopes of becoming a naturalist. However, the expedition didn’t work out well either. He was fed up with collecting bugs, bored with the company of his fellow explorers, and, to cap everything, he fell quite ill. Although desperate to return home, he was obliged to continue, and slowly he regained his strength, deciding that in spite of everything it had been a worthwhile diversion; no doubt heartened too by the prospect of finally returning home.

It was 1866 when James next resumed medical studies at Harvard, although the Amazon adventure had left him physically and (very probably) psychologically weakened; a continuing sickness forced James to break off from his studies yet again. Seeking rest and recuperation, for the next two years James sojourned in Europe, where, to judge from his own accounts, he again experienced a great deal of isolation, loneliness and boredom. Returning to America at the end of 1868 – now approaching twenty-seven years old – he picked up his studies at Harvard for the last time, successfully passing his degree to become William James M.D. in 1869.

Too weak to find work anyway, James stayed resolute in his unwillingness to become a practicing doctor. For a prolonged period, he did nothing at all, or next to nothing. Three years passed in which, aside from the occasional publication of articles and reviews, he devoted himself solely to reading books or thinking thoughts, often gloomy ones. Then, one day, he had a quite miraculous revelation: a very dark revelation that made him suddenly and exceedingly aware of his own mental fragility:

“Whilst in this state of philosophic pessimism and general depression of spirits about my prospects, I went one evening into the dressing room in the twilight… when suddenly there fell upon me without any warning, just as if it came out of the darkness, a horrible fear of my own existence. Simultaneously there arose in my mind the image of an epileptic patient whom I had seen in the asylum, a black-haired youth with greenish skin, entirely idiotic, who used to sit all day on one of the benches, or rather shelves, against the wall, with his knees drawn up against his chin, and the coarse gray undershirt, which was his only garment, drawn over them, inclosing his entire figure. He sat there like a sort of sculptured Egyptian cat or Peruvian mummy, moving nothing but his black eyes and looking absolutely non-human. This image and my fear entered into a species of combination with each other. That shape am I, I felt, potentially. Nothing that I possess can defend me against that fate, if the hour for it should strike for me as it struck for him. There was such a horror of him, and such a perception of my own merely momentary discrepancy from him, that it was as if something hitherto solid within my breast gave way entirely, and I became a mass of quivering fear. After this the universe was changed for me altogether. I awoke morning after morning with a horrible dread at the pit of my stomach, and with a sense of the insecurity of life that I never knew before, and that I have never felt since. It was like a revelation; and although the immediate feelings passed away, the experience has made me sympathetic with the morbid feelings of others ever since.” 5

Having suffered what today would very likely be called ‘a nervous breakdown’, James was forced to reflect on the current theories of the mind. Previously, he had accepted the materialist ‘automaton theory’ – that our ability to act upon the world depends not upon conscious states as such, but upon the brain-states that underpin and produce them – but now he felt that if true this meant he was personally trapped forever in a depression that could only be cured by some kind of physical remedy. With no such remedy forthcoming he was forced, however, to tackle his own disorder by further introspection and self-analysis.

James read more and thought more since there was nothing else he could do. Three more desperately unhappy years would pass before he had sufficiently recuperated to rejoin the ordinary world, accepting an offer to become lecturer in physiology at Harvard. But as luck would have it, teaching suited James. He enjoyed the subject of physiology itself, and found the activity of teaching “very interesting and stimulating”. James had, for once, landed on his feet, and his fortunes were also beginning to improve in other ways.

Enjoying the benefits of a steady income for the first time in his life, he was soon to meet Alice Gibbons, the future “Mrs W.J.” They married two years later in 1878. She was a perfect companion – intelligent, perceptive, encouraging, and perhaps most importantly for James, an organising force in his life. He had also just been offered a publishing contract to write a book on his main specialism, which was by now – and in spite of such diversity of training – most definitely psychology. With everything now in place, James set to work on what would be his magnum opus. Wasting absolutely no time whatsoever, he drafted the opening chapters while still on honeymoon.

“What is this mythological and poetical talk about psychology and Psyche and keeping back a manuscript composed during honeymoon?” he wrote in jest to the taunts of a friend, “The only psyche now recognized by science is a decapitated frog whose writhings express deeper truths than your weak-minded poets ever dreamed. She (not Psyche but the bride) loves all these doctrines which are quite novel to her mind, hitherto accustomed to all sorts of mysticisms and superstitions. She swears entirely by reflex action now, and believes in universal Nothwendigkeit. [determinism]” 6

It took James more than a decade to complete what quickly became the definitive university textbook on the subject, ample time for his ingrained materialist leanings to have softened. Sticking for the most part to what was directly and consciously known to him, he attempted to dissect the psyche through much painstaking introspection of what he famously came to describe as his (and our) “stream of consciousness”. Such close analysis of the subjective experience of consciousness itself suggested to James the need to distinguish between “the Me and the I” as separate component parts of what in completeness he called “the self”. 7 In one way or another, this division of self into selves, whether these be consciously apprehensible or not, has remained a theoretical basis of all later methods of psychoanalysis.

There is a joke that Henry James was a philosopher who wrote novels, whereas his brother William was a novelist who wrote philosophy. But this does WJ a disservice. James’s philosophy, known as pragmatism, was a later diversion. Unlike his writings on psychology, which became the standard academic texts as well as popular best-sellers (and what better tribute to James’s fluid prose), his ideas on pragmatism were rather poorly received. But then James was a lesser expert in philosophy, a situation not helped by his distaste for logical reasoning; and he is better remembered for his writings on psychology, a subject in which he excelled. Freud’s claim to originality is nothing like as foundational.

For James had been at the vanguard just as psychology was pulling irreparably away from the grip that philosophy had so long held (which explains why James was notionally Professor of Philosophy at the time he was writing), to be grafted back on as a branch of biology. For this reason, and notwithstanding that James remained as highly critical of the developing field of experimental psychology as he was of the deductive reasoners on both sides of the English Channel – the British empiricists Locke and Hume, and the continental giants Leibniz, Kant and Hegel – to some of his contemporaries James’s view appeared all too dangerously materialistic. If only they could have seen how areas of psychology were to so ruinously develop, they would have appreciated that James was, as always, a moderate.


Whereas James would remain an academic throughout his life, Freud, after briefly studying zoology at the University of Vienna (including one month spent unsuccessfully searching for the gonads of the male eel 8), turned next to neurology, but then decided to return to medicine and open his own practice. Freud also received expert training in the new-fangled techniques of hypnosis.

‘Hypnosis’ comes from the Greek hupnos and means, in effect, “artificial sleep”. To induce hypnosis, a patient’s conscious mind needs to be distracted briefly. Achieving this brings communion with a part of the patient’s mind which appears to be something other than the usual conscious state. This other was bound to be given a name, and so it is really not surprising that the terms “sub-conscious” and “unconscious” were already in circulation prior to the theories of Freud or James. But named or otherwise, mysterious evidence of the unconscious has always been known. Dreams, after all, though we consciously experience them, are neither consciously conceived nor willed. They just pop out from nowhere – or from “the unconscious”.

And Freud soon realised that there were better routes to the unconscious than hypnosis. For instance, he found that it was just as effective to listen to his patients, or if their conscious mind was still unwilling to give up some of its defences, to allow their free association of words and ideas. He also looked for unconscious connections within his patients’ dreams, gradually uncovering what he came to believe were the deeply repressed animalistic drives that governed their fears, attitudes and behaviour. Having found the unconscious root of their problems, the patient might at last begin to consciously grapple with it. It was a technique that apparently worked, with many of Freud’s patients recovering from the worst effects of their neuroses and hysteria, and so “the talking cure” became a lasting part of Freud’s legacy. You lay on the couch, and just out of sight, Freud listened and interpreted.

But Freud also left a bigger mark, by helping to shape the way we see ourselves. Drawing directly on his experiences as a doctor, he had slowly excavated, as he found it, the human unconscious piece by piece: and quite aside from aspects he labelled the superego and the id, Freud claimed to have discovered the existence of the libido: a primary, sexual drive that operated beneath our conscious awareness, prompting our desires for pleasure and avoidance of pain regardless of whether these desires conflicted with ordinary social conventions. Freud discerned a natural process of psychological development 9 and came to believe that whenever this process is arrested or, more generally, whenever normal instinctual appetites are consciously repressed, the repressed desires lurking deep within the unconscious will automatically resurface in more morbid forms. This was, he determined, the common root cause of his patients’ various symptoms and illnesses.

Had Freud stopped there, then I feel that his contribution to psychology would have been fully commendable, for there is considerable truth in what he is telling us. He says too much no doubt (especially when it comes to the specifics of human development), but he also says something that needed to be said most urgently: that if you force people to behave against their natures then you will very likely make them sick. However, Freud took his ideas a great deal further.

And so we come to the ‘Oedipus complex’, which of the many Freudian features of our supposed psychological nether regions is without doubt the one of greatest notoriety. The myth of Oedipus is a fascinating one in which the hero by his encounters is compelled to deal with fate, misfortune and prophecy. 10 But Freud finds in this tale a revelation of deep and universal unconscious repression:

“[Oedipus’s] destiny moves us only because it might have been ours – because the Oracle laid the same curse upon us before our birth as upon him. It is the fate of all of us, perhaps, to direct our first sexual impulse towards our mother and our first hatred and our first murderous wish against our father. Our dreams convince us that this is so.” 11

Freud generally studied those with minor psychological problems, determining on the basis of an unhappy few what he presumed true for healthier individuals too, and this is perhaps a failure of all psychoanalytic theories. It seems odd that he came to believe in the universality of the Oedipus complex, but then who can doubt that he himself suffered from it greatly? Perhaps he also felt a ‘castration anxiety’ as a result of the Oedipal rivalry he’d had with his own father. Maybe he even experienced “penis envy” – if not of the same intensity as he said he detected in his female patients, then of a compensatory masculine kind. After all, such unconscious ‘transference’ of attitudes and feelings from one person to another – from patient onto doctor, or vice versa in this relevant example – is another concept that Freud was first to identify and label.


Given the prudish age in which Freud had fleshed out his ideas, it seems surprising perhaps how swiftly these theories received widespread acceptance and acclaim, although I can think of two good reasons why Freudianism took hold. The first is straightforward: society had been very badly in need of a dose of Freud, or something very like Freud. After so much repression, the pendulum was bound to swing the other way. But arguably the more important reason – indeed the reason his theories have remained influential – is that Freud picked up the baton directly from where Darwin left off. Restricting his explanations to biological instincts and drives, Freudianism has the guise of scientific legitimacy, and this was a vital determining factor that helped to secure its prominent position within the modern epistemological canon.

Following his precedent, students of Freud, most notably Carl Jung and Alfred Adler, also drew on clinical experiences with their own patients, but gradually came to the conclusion, for different reasons, that Freud’s approach was too reductionist, and that there is considerably more to a patient’s mental well-being than healthy appetites and desires, and thus more to the psychological underworld than matters of sex and death.

Where Freud was a materialist and an atheist, Jung went on to incorporate aspects of the spiritual into his extended theory of the unconscious, though he remained respectful of biology and keen to anchor his own theories upon an evolutionary bedrock. Jung nevertheless speculates within a philosophical tradition that owes much to Kant, while also drawing heavily on personal experience, and comes to posit the existence of psychical structures he calls ‘archetypes’ operating at the deepest levels within a collective unconscious; a shared characteristic due to our common ancestry.

Thus he envisions ‘the ego’ – the aspect of our psyche we identify as “I” – as existing in relation to an unknown and finally unknowable sea of autonomous entities which have their own life. Jung actually suggests that Freud’s Oedipus complex is just one of these archetypes, while he finds himself drawn by the bigger fish of the unconscious: beginning with ‘The Shadow’ – what is hidden and rejected by the ego – and then the communicating figures of ‘Anima/Animus’ (he also calls these ‘The Syzygy’) that prepare us for incremental and never-ending revelations of our all-encompassing ‘Self’. This lifelong psychical development, or ‘individuation’, was seen by Jung as an inherently religious quest, and he is unapologetic in proclaiming so; the religious impulse being simply another product of our evolutionary development, along with opposable thumbs and walking upright. However, rather than a mere vestigial hangover, religion is, for Jung, fundamental to the deep nature of our species.

Unlike Freud, Jung was also invested in understanding how the human psyche varies greatly from person to person, and to these ends introduced new ideas about character types, adding ‘introvert’ and ‘extrovert’ to the psychological lexicon to draw a clear division between individuals characterised either by primarily subjective or objective orientations to life – an introvert himself, Jung was able to see a distinction. Meanwhile, Adler was more concerned with our social identities, and why people felt – in very many cases quite irrationally – inferior or superior to others. Adler’s efforts culminated in the development of a theory of the ‘inferiority complex’ – which might also be thought of as an aspect of the Jungian ‘Shadow’.

These different schools of psychoanalysis are not irreconcilable. They are indeed rather complementary in many ways: Freud tackling the animal craving and want of pleasure; Jung looking for expression above and beyond what William Blake once referred to as “this vegetable world”; and Adler delving most directly into the mud of human relations, and the lasting effects of personal trauma associated with social inequalities.

Freud presumes that since we are biological products of Darwinian evolution, our minds have been evolutionarily pre-programmed. Turning the same inquiry outward, Jung goes in search of common symbolic threads within mythological and folkloric traditions, enlisting these as evidence for the psychological archetypes buried deep within us all. And though Jung held no orthodox religious views of his own, he felt perfectly comfortable drawing upon religious (including overtly Christian) symbolism. In one of his most contemplative passages, Jung wrote:

Perhaps this sounds very simple, but simple things are always the most difficult. In actual life it requires the greatest art to be simple, and so acceptance of oneself is the essence of the moral problem and the acid test of one’s whole outlook on life. That I feed the beggar, that I forgive an insult, that I love my enemy in the name of Christ—all these are undoubtedly great virtues. What I do unto the least of my brethren, that I do unto Christ.

But what if I should discover that the least amongst them all, the poorest of all beggars, the most impudent of all offenders, yea the very fiend himself—that these are within me, and that I myself stand in need of the alms of my own kindness, that I myself am the enemy who must be loved—what then? Then, as a rule, the whole truth of Christianity is reversed: there is then no more talk of love and long-suffering; we say to the brother within us “Raca,” and condemn and rage against ourselves. We hide him from the world, we deny ever having met this least among the lowly in ourselves, and had it been God himself who drew near to us in this despicable form, we should have denied him a thousand times before a single cock had crowed. 12

Of course, “the very fiend himself” is the Jungian ‘Shadow’, the contents of which, without recognition and acceptance, inevitably remain repressed, causing our thoughts and actions to be governed unconsciously and these unapproachable and rejected aspects of our own psyche to be projected out on to the world. ‘Shadow projection’ onto others fills the world with enemies of our own imagining; and this, Jung believed, was the root of nearly all evil. Alternatively, by taking Jung’s advice and accepting “that I myself am the enemy who must be loved”, we come back to ourselves in wholeness, and then not only does the omnipresent threat of the Other diminish, but the veil of illusion between the ego and reality is thinned, with access to previously hidden strengths further enabling us to reach our fuller potential. 13

Today there are millions doing “shadow work” as it is now popularly known: self-help exercises often combined with traditional practices of yoga, meditation or the ritual use of entheogens: so here is a new meeting place – a modern mash-up – of religion and psychotherapy. Quietly and individually, a shapeless movement has arisen almost spontaneously as a reaction to the peculiar rigours of western civilisation. Will it change the world? For better or worse, it already has.

Alan Watts, who is best known for his Western interpretations of Eastern spiritual traditions – in particular Zen Buddhism and Daoism – here reads this same influential passage from one of Jung’s lectures, in which he speaks of ending “the inner civil war”:


Now what about my joke at the top? What’s that all about? Indeed, and in all seriousness, what makes it a joke at all? Well, not wishing to delve deeply into theories of comedy, there is one structure that arises repeatedly and nearly universally: the punch line to every joke relies on some kind of unexpected twist on the set-up…

“Why did the chicken cross the road?” Here we find an inherent ambiguity in the word “why”, and this is what sets up the twist. However, in the case of the joke about the psychiatrist and the man who thinks he’s a moth, the site of ambiguity isn’t so obvious. Here the humour, I think, comes down to our different and conflicting notions of ‘belief’.

A brief digression then: What is belief? To offer a salient example, when someone tells you “I believe in God”, what are they intending to communicate? No less importantly, what would you take them to mean? Put differently, atheists will very often say “I don’t believe in anything” – so again, what are they (literally) trying to convey here? And what would a listener take them to mean? Because in all these instances the same word is used to describe similar but distinct attitudinal relationships to reality, so it is easy to presume that everyone is using the word in precisely the same way, which immediately becomes less certain once we acknowledge that the word “belief” actually carries two quite distinct meanings.

According to the first definition, it is “a mental conviction of the truth of an idea or some aspect of reality”. Belief in UFOs fits this criterion, as does a belief in gravity and that the sun will rise again tomorrow. How about belief in God? When late in life Jung was asked if he believed in God, he replied straightforwardly “I know”. 14 Others reply with just the same degree of conviction when asked about angels, fairies, spirit guides, ghosts or the power of healing and crystals. As a physicist, I believe in the existence of atoms, electrons and quarks – although I’ve never “seen one”, like Jung I know! Belief in this sense is more often than not grounded in a person’s direct experience, which obviously does not validate the objective truth of the belief. He saw a ghost. She was healed by the touch of a holy man. We ran experiments to measure the charge on an electron. Likewise, I have never personally known anyone who did not believe in the physical reality of a world of solid objects – for who doesn’t believe in tables and chairs? In this important sense everyone has many convictions about the truth of reality, and we surely all believe in something – this applies even in the case of the most hardline of atheists!

But there is also a second kind of belief: “of an idea that is believed to be true or valid without positive knowledge.” The emphasis here is on the lack of knowledge or indeed of direct experience. So this belief involves an effort of willing on the part of the believer. In many ways, this is to believe in make-believe, or we might just say “to make-believe”: to pretend or wish that something is real. I believe in unicorns…

As a child, I found all religion utterly mystifying, since what was clearly make-believe was, for reasons I couldn’t comprehend, being held up as sacrosanct. Based on my fleeting encounters with Christianity, it also seemed evident that the harder you tried to make-believe in this maddening mystification of being, the better a person it made you! So when someone says they believe in God, is this all they actually mean? That they are trying very hard to make-believe in impossibility? Indeed, is this striving alone mistaken not only as virtuous but as actual believing in the first sense? Yes, quite possibly – and not only for religious types. As Kurt Vonnegut wrote in the introduction to his novel Mother Night: “This is the only story of mine whose moral I know”, continuing: “We are what we pretend to be, so we must be careful about what we pretend to be.” 15

Alternatively, it may be that someone truly believes in God – or whatever synonym they choose to approximate to ‘cosmic higher consciousness’ – with the same conviction that all physicists believe in gravity and atoms. They may come to know ‘God’, as Jung did.

Now back to the joke and apologies for killing it: The man complains that he feels like a moth and this is so silly that we automatically presume his condition is entirely one of make-believe. But then the twist, when we learn that he is acting according to his conviction, which means he has true belief of the first kind. Here’s my hunch then for why we find this funny: it spontaneously reminds us of how true beliefs – rather than make-believe – both inform reality as we perceive it, and fundamentally shape our behaviour. So here we come back to Vonnegut’s moral. Yet we are always in the process of forgetting altogether that this is how we also live, until abruptly the joke reminds us again – and in our moment of recollecting, spontaneously we laugh.


And the joke was hilarious, wasn’t it? No, you didn’t like it…? Well, if beauty is in the eye of the beholder, comedy surely lies in the marrow of the funny bone! Which brings me to ask why there is comedy at all. More broadly, why is there laughter – surely the most curious human reflex of all – or its closely related reflex cousin, crying? In fact, the emission of tears from the nasolacrimal ducts not in response to irritation of our ocular structures but purely for reasons of joy or sorrow is a very nearly uniquely human secretomotor phenomenon. (Excuse my Latin!) 16

The jury is still out on the evolutionary function of laughing and crying, but when considered in strictly Darwinian terms (as current science insists), it is hard to fathom why these dangerously debilitating and potentially life-threatening responses ever developed in any species. It is indeed acknowledged that a handful of unlucky (perhaps lucky?) people have literally died from laughter. So why do we laugh? Why do we love laughter, whether ours or others’, so much? Your guess is as good as mine, and, more importantly, as good as Darwin’s.

Less generously, Thomas Hobbes, who explained all human behaviour in terms of gaining social advantage, wrote that:

Joy, arising from imagination of a man’s own power and ability, is that exultation of the mind which is called glorying… Sudden Glory, is the passion which maketh those grimaces called LAUGHTER; and is caused either by some sudden act of their own, that pleaseth them; or by the apprehension of some deformed thing in another, by comparison whereof they suddenly applaud themselves. 17

And yes, it is true that a great deal of laughter is at the expense of some butt of our joking; however, not all mockery involves an injured party, and there’s a great deal more to humour and laughter than merely ridicule and contempt. So Hobbes’ account is at best a very desiccated postulation for why humans laugh, let alone what constitutes joy.

Indeed, Hobbes’ reductionism is evidently mistaken and misinformed not only by his deep-seated misanthropy, but also by a seeming lack of common insight which leads one to suspect that when it came to sharing any jokes, he just didn’t get it. But precisely what didn’t he get?

Well, apparently he didn’t get how laughter can be a straightforward expression of joie de vivre. Too French I imagine! Or that when we apprehend anything and find it amusing, this momentarily snaps us from a prior state of inattention and on the occasion of finding joy in an abrupt, often fleeting, but totally fresh understanding, the revelation itself may elicit laughter (as I already outlined above). Or that it is simply impossible to laugh authentically or infectiously unless you not only understand the joke, but fully acknowledge it. In this way, humour, if confessional, may be liberating at a deeply personal level, or if satirical, liberating at a penetrating societal level. Lastly (in my limited rundown), humour serves as a wonderfully efficient and entertaining springboard for communicating insight and understanding, especially when the truths are dry, difficult to grasp or otherwise unpalatable. Here is a rhetorical economy that Hobbes might actually have approved were it not for his somewhat curmudgeonly disposition.

And why tell a joke here? Just to make you laugh and take your mind off the gravity of the topics covered and still more grave ones to come? To an extent, yes, but also to broaden out our discussion, letting it drift off into related philosophical avenues. For existence is seemingly absurd, is it not? Considered squarely, full-frontal, what’s it all about…? And jokes – especially ones that work beyond rational understanding – offer a playful recognition of the nonsensicalness of existence and of our species’ farcical determination to comprehend it and ourselves fully. What gives us the gall to ever speculate on the meaning of life, the universe and everything?

Meanwhile, we are free to choose: do we laugh or do we cry at our weird predicament? Both responses are surely sounder than cool insouciance, since both are flushed with blood. And were we madder, we might scream instead, of course, whether in joy or terror.

French existentialist Albert Camus famously made the claim: “There is but one truly serious philosophical problem and that is suicide.” 18 Camus was not an advocate of suicide, of course; far from it. In fact, he saw it as a perfectly vain attempt to flee from the inescapable absurdity of life, something he believed we ought to embrace in order to live authentically. Indeed, Camus regarded every attempt to deny the ultimate meaninglessness of life in a universe that is indifferent to our suffering as a surrogate form of psychological suicide.

But rather than staring blankly into the abyss, Camus urges us to rebel against it: to face its absurdity without flinching and, through rebellion, paradoxically to reconstruct the meaning of our lives afresh. Although perhaps he goes too far, and reaches a point so extreme that few can follow: such a Sisyphean outlook being too desolate for most of us, and his exhortation to authenticity so impassioned that it seems almost infinitely taxing. 19 Kierkegaard’s “leap of faith” is arguably more forgiving of the human condition – but enough of philosophy. 20

This pause is meant for introspection: an opportunity to reconsider how this interlude set out, not only by telling a joke – and hopefully one that made you smile if not laugh out loud – but also by reflecting upon the beautiful wisdom encapsulated in Chuang Tzu’s dream of becoming a butterfly; mystical enlightenment from fourth-century BC China that in turn clashes against the plain silliness of a doctor-doctor joke about a moth-man; a surreal quip that also brushes on the use of clinical diagnosis in modern psychiatry (something I shall be coming to consider next).

The running theme here is one of transformation, and at the risk of also killing Chuang Tzu’s message by dissection, I will simply add (unnecessarily from the Daoist perspective) that existence does appear to be cyclically transformative; at both personal and collective levels, the conscious and unconscious, spiralling outwards – whether upward into light or downward into darkness – each perpetually giving rise to the other just like the everblooming of yang and yin. As maverick clinical psychiatrist R. D. Laing once wrote:

“Most people most of the time experience themselves and others in one way or another that I… call egoic. That is, centrally or peripherally, they experience the world and themselves in terms of a consistent identity, a me-here over against you-there, within a framework of certain ground structures of space and time shared with other members of their society… All religious and all existential philosophies have agreed that such egoic experience is a preliminary illusion, a veil, a film of maya—a dream to Heraclitus, and to Lao Tzu, the fundamental illusion of all Buddhism, a state of sleep, of death, of socially accepted madness, a womb state to which one has to die, from which one has to be born.” 21

Returning from the shadowlands of alienation to contemplate again the glinting iridescent radiance of Chuang Tzu’s butterfly wings is an invitation to scrape away the dross of habituated semi-consciousness that veils the playful mystery of our minds. On a different occasion, Chuang Tzu wrote:

One who dreams of drinking wine may in the morning weep; one who dreams weeping may in the morning go out to hunt. During our dreams we do not know we are dreaming. We may even dream of interpreting a dream. Only on waking do we know it was a dream. Only after the great awakening will we realize that this is the great dream. And yet fools think they are awake, presuming to know that they are rulers or herdsmen. How dense! You and Confucius are both dreaming, and I who say you are a dream am also a dream. Such is my tale. It will probably be called preposterous, but after ten thousand generations there may be a great sage who will be able to explain it, a trivial interval equivalent to the passage from morning to night. 22

Thus the world about us is scarcely less a construct of our imagination than our dreams are, deconstructed by the senses then seamlessly reconstructed in its entirety. And not just reconfigured via inputs from the celebrated five gateways of vision, sound, touch, taste and smell, but all portals including those of memory, intuition, and even reason. After all, it is curious how we speak of having ‘a sense’ of reason, just as we do ‘a sense’ of humour. Well, do we have… a sense of reason and a sense of humour? If you have followed this far then I sense you may share my own.

Next chapter…


Richard Rohr is a Franciscan priest, author and teacher, who says that his calling has been “to retrieve and re-teach the wisdom that has been lost, ignored or misunderstood in the Judeo-Christian tradition.” Rohr is the founder of the Center for Action and Contemplation and academic dean of the CAC’s Living School, where he practises incarnational mysticism, non-dual consciousness, and contemplation, with a particular emphasis on how these affect the social justice issues of our time. Recently he shared his inspirational standpoint in an hour-long chat with ‘Buddha at the Gas Pump’s Rick Archer:


Addendum: anyone with half a brain

“The intuitive mind is a sacred gift and the rational mind is a faithful servant. We have created a society that honors the servant and has forgotten the gift.”

— attributed to Albert Einstein 23


The development of split-brain operations for the treatment of severe cases of epilepsy, which involve the severing of the corpus callosum – a thick web of nerves that allows communication between the two hemispheres – first drew attention to how the left and right hemispheres have quite different attributes. Unfortunately, the early studies in this field produced superficial and erroneous notions about left and right brain functions that were in turn vulgarised and popularised when they percolated down into pop psychology and management theory. The left brain was said to generate language and logic, while only the right brain supposedly dealt with feelings and was the creative centre. In reality, both hemispheres are involved in all aspects of cognition, and as a consequence the study of what is technically called the lateralisation of brain function fell to some extent into academic disrepute.

In fact, important differences do occur between the specialisms of the left and right hemispheres, although as psychiatrist Iain McGilchrist proposes in his book The Master and His Emissary (titles he assigns to the proper roles of the right and left hemispheres respectively) 24, it is often better to understand the distinctions in terms of where conscious awareness is placed. In summary, the left hemisphere attends to and focuses narrowly but precisely on what is immediately in front of you, allowing you to strike the nail with the hammer, thread the eye of the needle, sort the wheat from the chaff (or whatever activity you might be actively engaged with), while the right hemisphere remains highly vigilant and attentive to the surroundings. Thus, the left brain operates tools and usefully sizes up situations, while the right brain’s immediate relationship to the environment and to our bodies makes it the mediator of social activities and of a far broader conscious awareness. However, according to McGilchrist, the left brain is also convinced of its primacy, whereas the right is incapable of comprehending such hierarchies, and this is arguably the root of a problem we all face, since it repeatedly leads humans to construct societal arrangements and norms in accordance with left brain dominance, to the inevitable detriment of less restricted right brain awareness.

Supported by many decades of research, this has become the informed view of McGilchrist, and granted that his overarching thesis has merit – note that the basic distinctions between left and right brain awareness are uncontroversial and well understood in psychology, whereas what he sees as the socio-historical repercussions is more speculative – it raises brain function lateralisation as a major underlying issue that needs to be incorporated in any final appraisal of ‘human nature’, the implications of which McGilchrist propounds at length in his own writing. In the preface to the new expanded edition of The Master and His Emissary (first published in 2009), he writes:

I don’t want it to be possible, after reading this book, for any intelligent person ever again to see the right hemisphere as the ‘minor’ hemisphere, as it used to be called – still worse the flighty, impetuous, fantastical one, the unreliable but perhaps fluffy and cuddly one – and the left hemisphere as the solid, dependable, down-to-earth hemisphere, the one that does all the heavy lifting and is alone the intelligent source of our understanding. I might still be to some extent swimming against the current, but there are signs that the current may be changing direction.


Embedded below is a lecture given to the Royal Society of Arts (RSA) in 2010, in which he offers a concise overview of how according to our current understanding, the ‘divided brain’ has profoundly altered human behaviour, culture and society:

To hear these ideas contextualised within an evolutionary account of brain laterality, I also recommend a lecture given to The Evolutionary Psychiatry Special Interest Group of the Royal College of Psychiatrists in London (EPSIG UK) in 2018:

For more from Iain McGilchrist I also recommend this extended interview with physicist and filmmaker Curt Jaimungal, host of Theories of Everything, which premiered on March 29th:


Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.


1 Quoted from the book known as Zhuangzi (also transliterated as Chuang Tzu or Chuang Chou). Translation by Lin Yutang

2 “insensible perceptions are as important to [the science of minds, souls, and soul-like substances] as insensible corpuscles are to natural science, and it is just as unreasonable to reject the one as the other on the pretext that they are beyond the reach of our senses.” From the Preface of New Essays concerning Human Understanding by Gottfried Leibniz, first published in 1704; translation courtesy of the Stanford Encyclopedia of Philosophy.

3 From Anthropology from a Pragmatic Point of View by Immanuel Kant, first published in 1798.

4 “The definition of Psychology may be best given… as the description and explanation of states of consciousness as such. By states of consciousness are meant such things as sensations, desires, emotions, cognitions, reasonings, decisions, volitions, and the like. Their ‘explanation’ must of course include the study of their causes, conditions, and immediate consequences, so far as these can be ascertained.” from opening paragraph of “Introduction: Body and Mind” from The Principles of Psychology, by William James, first published in 1892.

5 Extract taken from The Varieties of Religious Experience, from chapter on “The Sick Soul”.

6 Letter to his friend, Francis Child.

7 According to James, the first division of “the self” that can be discriminated is between “the self as known”, the me, and “the self as knower”, the I, or “pure ego”. The me he then suggests might be sub-divided in a constituent hierarchy: “the material me” at the lowest level, then “the social me” and, top-most, “the spiritual me”. It was not until very much later, in the 1920s, that Freud fully developed his own tripartite division of the psyche into id, ego and super-ego, a division that surely owes much to James.


8 In the spring of 1876, a young man of nineteen arrived in the seaside city of Trieste and set about a curious task. Every morning, as the fishermen brought in their catch, he went to meet them at the port, where he bought eels by the dozens and then the hundreds. He carried them home, to a dissection table in a corner of his room, and—from eight until noon, when he broke for lunch, and then again from one until six, when he quit for the day and went to ogle the women of Trieste on the street—he diligently slashed away, in search of gonads.

“My hands are stained by the white and red blood of the sea creatures,” he wrote to a friend. “All I see when I close my eyes is the shimmering dead tissue, which haunts my dreams, and all I can think about are the big questions, the ones that go hand in hand with testicles and ovaries—the universal, pivotal questions.”

The young man, whose name was Sigmund Freud, eventually followed his evolving questions in other directions. But in Trieste, elbow-deep in slime, he hoped to be the first person to find what men of science had been seeking for thousands of years: the testicles of an eel. To see them would be to begin to solve a profound mystery, one that had stumped Aristotle and countless successors throughout the history of natural science: Where do eels come from?

From an article entitled “Where Do Eels Come From?” written by Brooke Jarvis, published in The New Yorker on May 18, 2020.

9 Fixing on specific erogenous zones of the body, Freud believed that libidinous desire shaped our psychological development in a very specific fashion, naturally progressing, if permitted, through early stages from oral, to anal, and, then reaching adulthood, to genital.

10 Jocasta, the queen of Thebes, is barren, and so she and her husband, the king Laius, decide to consult the Oracle of Delphi. The Oracle tells them that if Jocasta bears a son, then the son will kill his father and marry her. Later, when Jocasta does indeed have a son, Laius demands that a servant take the baby to a mountain to be abandoned, his ankles pinned together just in case. But Oracles are rarely mistaken, fate is hard to avoid, and so as it happens the servant spares the infant, giving him to a shepherd instead. Eventually, as fortune will have it, the infant is adopted by the king and queen of Corinth, and named Oedipus because of the swellings on his feet. Years pass. Then, one day Oedipus learns that the king and queen are not his parents, but when he asks them, they deny the truth. So Oedipus decides to put the question to the Oracle of Delphi instead, who, being an enigmatic type, refuses to identify his true parents, but foretells his future instead, saying that he is destined to kill his father and marry his mother. Determined to avoid this fate, Oedipus resolves not to return home to Corinth, heading to, you guessed it, Thebes instead. He comes to an intersection of three roads and meets Laius driving a chariot. They argue about who has the right of way and then, in an early example of road rage, the quarrel spills into a fight and thus Oedipus unwittingly kills his real father. Next up, he meets the sphinx, who asks its famous riddle. This is a question of life and death, all who have answered incorrectly having been killed and eaten, but Oedipus gets the answer right and so, obligingly, the sphinx kills itself instead. Having freed the people of Thebes from the sphinx, Oedipus receives the hand of the recently widowed Jocasta in marriage. All is well for a while, but then it comes to pass that Jocasta learns who Oedipus really is, and hangs herself. Then, later again, Oedipus discovers that he was the murderer of his own father, and gouges his own eyes out.

11 Sigmund Freud, The Interpretation of Dreams, chapter V, “The Material and Sources of Dreams”

12 From an essay by C.G. Jung published in CW XI, Para 520. The word ‘Raca’ is an insult translated as ‘worthless’ or ‘empty’ taken from a passage in the Sermon on the Mount from Matthew 5:22.

13 Jung described the shadow in a key passage as “that hidden, repressed, for the most part inferior and guilt-laden personality whose ultimate ramifications reach back into the realm of our animal ancestors… If it has been believed hitherto that the human shadow was the source of evil, it can now be ascertained on closer investigation that the unconscious man, that is, his shadow, does not consist only of morally reprehensible tendencies, but also displays a number of good qualities, such as normal instincts, appropriate reactions, realistic insights, creative impulses, etc.”

From Jung’s Collected Works, 9, part 2, paragraph 422–3.

14 In response to a question in an interview conducted by John Freeman just two years before Jung’s death and broadcast as part of the BBC Face to Face TV series in 1959. Freeman asked about Jung’s childhood and whether he had to attend church, and then asked: “Do you now believe in God?” Jung replied: “Now? Difficult to answer… I know. I don’t need to believe, I know.”

15 The quote in full reads: “This is the only story of mine whose moral I know. I don’t think it’s a marvelous moral, I just happen to know what it is: We are what we pretend to be, so we must be careful about what we pretend to be.” From Mother Night (1962) by Kurt Vonnegut.

16 In his follow-up to the more famous On the Origin of Species (1859) and The Descent of Man (1871), Charles Darwin reported in Chapter VI, entitled “Special Expressions of Man: Suffering and Weeping”, of his third major work The Expression of the Emotions in Man and Animals (1872), that:

I was anxious to ascertain whether there existed in any of the lower animals a similar relation between the contraction of the orbicular muscles during violent expiration and the secretion of tears; but there are very few animals which contract these muscles in a prolonged manner, or which shed tears. The Macacus maurus, which formerly wept so copiously in the Zoological Gardens, would have been a fine case for observation; but the two monkeys now there, and which are believed to belong to the same species, do not weep. Nevertheless they were carefully observed by Mr. Bartlett and myself, whilst screaming loudly, and they seemed to contract these muscles; but they moved about their cages so rapidly, that it was difficult to observe with certainty. No other monkey, as far as I have been able to ascertain, contracts its orbicular muscles whilst screaming.

The Indian elephant is known sometimes to weep. Sir E. Tennent, in describing these which he saw captured and bound in Ceylon, says, some “lay motionless on the ground, with no other indication of suffering than the tears which suffused their eyes and flowed incessantly.” Speaking of another elephant he says, “When overpowered and made fast, his grief was most affecting; his violence sank to utter prostration, and he lay on the ground, uttering choking cries, with tears trickling down his cheeks.” In the Zoological Gardens the keeper of the Indian elephants positively asserts that he has several times seen tears rolling down the face of the old female, when distressed by the removal of the young one. Hence I was extremely anxious to ascertain, as an extension of the relation between the contraction of the orbicular muscles and the shedding of tears in man, whether elephants when screaming or trumpeting loudly contract these muscles. At Mr. Bartlett’s desire the keeper ordered the old and the young elephant to trumpet; and we repeatedly saw in both animals that, just as the trumpeting began, the orbicular muscles, especially the lower ones, were distinctly contracted. On a subsequent occasion the keeper made the old elephant trumpet much more loudly, and invariably both the upper and lower orbicular muscles were strongly contracted, and now in an equal degree. It is a singular fact that the African elephant, which, however, is so different from the Indian species that it is placed by some naturalists in a distinct sub-genus, when made on two occasions to trumpet loudly, exhibited no trace of the contraction of the orbicular muscles.


17 Quote from Leviathan (1651), The First Part, Chapter 6, by Thomas Hobbes (with italics and spelling as original). Hobbes continues:

And it is incident most to them, that are conscious of the fewest abilities in themselves; who are forced to keep themselves in their own favour, by observing the imperfections of other men. And therefore much Laughter at the defects of others is a signe of Pusillanimity. For of great minds, one of the proper workes is, to help and free others from scorn; and compare themselves onely with the most able.

Interestingly, Hobbes then immediately offers his account of weeping as follows:

On the contrary, Sudden Dejection is the passion that causeth WEEPING; and is caused by such accidents, as suddenly take away some vehement hope, or some prop of their power: and they are most subject to it, that rely principally on helps externall, such as are Women, and Children. Therefore, some Weep for the loss of Friends; Others for their unkindnesse; others for the sudden stop made to their thoughts of revenge, by Reconciliation. But in all cases, both Laughter and Weeping, are sudden motions; Custome taking them both away. For no man Laughs at old jests; or Weeps for an old calamity.

18 “Il n’y a qu’un problème philosophique vraiment sérieux : c’est le suicide.” (“There is but one truly serious philosophical problem, and that is suicide.”) Quote taken from The Myth of Sisyphus (1942) by Albert Camus, translated by Justin O’Brien.

19 In Greek mythology Sisyphus was punished in hell by being forced to roll a huge boulder up a hill only for it to roll down every time, repeating his action for eternity. In his philosophical essay The Myth of Sisyphus (1942) Camus compares this unremitting and unrewarding task of Sisyphus to the lives of ordinary people in the modern world, writing:

“The workman of today works every day in his life at the same tasks, and this fate is no less absurd. But it is tragic only at the rare moments when it becomes conscious.”

In sympathy he also muses on Sisyphus’ thoughts especially as he trudges in despair back down the mountain to collect the rock again. He writes:

“You have already grasped that Sisyphus is the absurd hero. He is, as much through his passions as through his torture. His scorn of the gods, his hatred of death, and his passion for life won him that unspeakable penalty in which the whole being is exerted toward accomplishing nothing. This is the price that must be paid for the passions of this earth. Nothing is told us about Sisyphus in the underworld. Myths are made for the imagination to breathe life into them.”


“It is during that return, that pause, that Sisyphus interests me. A face that toils so close to stones is already stone itself! I see that man going back down with a heavy yet measured step toward the torment of which he will never know the end. That hour like a breathing-space which returns as surely as his suffering, that is the hour of consciousness. At each of those moments when he leaves the heights and gradually sinks toward the lairs of the gods, he is superior to his fate. He is stronger than his rock.

“If this myth is tragic, that is because its hero is conscious. Where would his torture be, indeed, if at every step the hope of succeeding upheld him? The workman of today works every day in his life at the same tasks, and this fate is no less absurd. But it is tragic only at the rare moments when it becomes conscious. Sisyphus, proletarian of the gods, powerless and rebellious, knows the whole extent of his wretched condition: it is what he thinks of during his descent. The lucidity that was to constitute his torture at the same time crowns his victory. There is no fate that cannot be surmounted by scorn.”


20 Søren Kierkegaard never actually coined the term “leap of faith” although he did use the more general notion of “leap” to describe situations whenever a person is faced with a choice that cannot be fully justified rationally. Moreover, in this instance the “leap” is perhaps better described as a leap “towards” or “into” faith that finally overcomes what Kierkegaard saw as an inherent paradoxical contradiction between the ethical and the religious. However, Kierkegaard never advocates “blind faith”, but instead recognises that faith ultimately calls for action in the face of absurdity.

In Part Two, “The Subjective Issue”, of his 1846 work and impassioned attack against Hegelianism, Concluding Unscientific Postscript to the Philosophical Fragments (Danish: Afsluttende uvidenskabelig Efterskrift til de philosophiske Smuler), which is known for its dictum, “Subjectivity is Truth”, Kierkegaard wrote:

“When someone is to leap he must certainly do it alone and also be alone in properly understanding that it is an impossibility… the leap is the decision… I am charging the individual in question with not willing to stop the infinity of [self-]reflection. Am I requiring something of him, then? But on the other hand, in a genuinely speculative way, I assume that reflection stops of its own accord. Why, then, do I require something of him? And what do I require of him? I require a resolution.”

21 R. D. Laing, The Politics of Experience (Ballantine Books, N.Y., 1967)

22 Quoted from the book known as Zhuangzi (also transliterated as Chuang Tzu or Chuang Chou). Translation by Lin Yutang

23 Although in all likelihood a reworking of a passage from a book entitled The Metaphoric Mind: A Celebration of Creative Consciousness written by Bob Samples and published in 1976 in which the fuller passage reads [with emphasis added]:

“The metaphoric mind is a maverick. It is as wild and unruly as a child. It follows us doggedly and plagues us with its presence as we wander the contrived corridors of rationality. It is a metaphoric link with the unknown called religion that causes us to build cathedrals — and the very cathedrals are built with rational, logical plans. When some personal crisis or the bewildering chaos of everyday life closes in on us, we often rush to worship the rationally-planned cathedral and ignore the religion. Albert Einstein called the intuitive or metaphoric mind a sacred gift. He added that the rational mind was a faithful servant. It is paradoxical that in the context of modern life we have begun to worship the servant and defile the divine.”

24 The book is subtitled The Divided Brain and the Making of the Western World


Filed under « finishing the rat race »

keep taking the tablets

The following article is Chapter Four of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from the category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.


“Psychiatry could be, or some psychiatrists are, on the side of transcendence, of genuine freedom, and of true human growth. But psychiatry can so easily be a technique of brainwashing, of inducing behaviour that is adjusted, by (preferably) non-injurious torture. In the best places, where straitjackets are abolished, doors are unlocked, leucotomies largely forgone, these can be replaced by more subtle lobotomies and tranquillizers that place the bars of Bedlam and the locked doors inside the patient.”  

— R. D. Laing in a later preface to The Divided Self. 1


A few notes of caution before proceeding:

From this point onwards I shall use the words ‘madness’ and ‘insanity’ interchangeably, to denote mental illness of different kinds in an entirely general and overarching way. Beyond the shorthand, I have adopted this approach for two principal reasons. Given the nature of the field and on the basis of historical precedent, technical labels tend to be transitory and superseded, and so traditional and non-technical language avoids our need to grapple with the elaborate definitions found in medical directories of psychiatry (more later), while taking this approach also keeps clear of the euphemism treadmill. Moreover, the older terms have a simplicity which, if used with sensitivity, bestows weight on the day-to-day misery of mental illness and dignifies its suffering. R. D. Laing, who spent a lifetime treating patients with the most severe schizophrenia, unflinchingly talked about ‘madness’. A flawed genius, Laing is someone I return to in the final section of the chapter.

The second point I wish to highlight is that illnesses associated with the workings of the mind will sadly, in all likelihood, remain a cause for social prejudice and discrimination. In part this is due to the detrimental effect mental illness has on interpersonal relationships. And since ‘the person’ – whatever this entity can be said to fully represent – is presupposed to exist in a kind of one-to-one equivalence to the mind, it is basically taken for granted not only that someone’s behaviour correlates to unseen mental activity, but that it is an expression of a person’s character. Indeed, person, mind and behaviour are usually apprehended as a sort of coessential three-in-one.

All forms of suffering are difficult to face, of course, for loved ones as for the patient; however, our degree of separation becomes heightened once someone’s personality is significantly altered through illness. I contend, though, that beyond these often practical concerns, there are further barriers that lie in the way of our full acceptance of mental illness, ones automatically instilled by everyday attitudes and opinions that may cause us to register a greater shock when faced with the sufferings of an unsound mind; some features of the disease not just directly clashing with expectations of acceptable human behaviour, but on occasion threatening fundamental notions of what it means to be human.

For these reasons mental illness tends to isolate its victims: those who in all likelihood are suffering profound existential detachment become further detached from ordinary human contact. In extreme circumstances, mental illness makes its victims appear as monstrosities – the freaks whom ordinary folks once visited asylums simply to gawp at, when it cost only a shilling to see “the beasts” rave at Bedlam, as London’s Bethlem Royal Hospital was once known. 2 Whom the gods would destroy they first make mad, the ancient saying goes 3, and it is difficult indeed to conjure up any worse fate than this.


Before returning to the main issues around mental illness, I wish to briefly consider the changing societal attitudes toward behaviour in general. The ongoing trend for many decades has been for society to become more tolerant of alternative modes of thinking and acting. Indeed, a plethora of interpersonal norms have either lapsed altogether or are now regarded as old-fashioned and outmoded, with others already in the process of slow abandonment. For successive generations, the youth has looked upon itself as more liberated than its parents’ generation, which it then regards, rightly or wrongly, as repressive and rigid.

To cite a rather obvious example, from the 1950s onwards sex has been gradually and almost totally unhitched from marriage, and commensurate with this detachment there is more and more permission – indeed encouragement – to be sexually experimental: yesterday’s magnolia has been touched up to include a range of fifty thousand shades of grey! 4 But the zone of the bedroom is perhaps the exception rather than the rule, and outside its liberally sanctioned walls much that was seen as transgressive remains so, and in fact continues to be either prohibited by law or else proscribed by customs or just ‘plain common sense’ – thus we are constrained by restrictions sometimes imposed for perfectly sound reasons plus others that lack clear ethical or rational justification.

Arguably indeed, there are as many taboos today as yesterday that inform our oftentimes odd and incoherent relationships to our own bodies and minds. As another illustrative example, most of us have probably heard how the Victorians were so prudish that they would conceal the nakedness of their piano legs behind little skirts of modesty (in fact an urban myth), when it is surely more scandalous (at least by today’s standards) that over the counter at the local apothecary, drugs including laudanum (tincture of opium) were freely available to all.

It seems indeed that just as we loosened restraints on sexuality, new anxieties began to spring up concerning our relationship with our bodies as such. Suddenly perhaps we had more to measure up to, especially once all the bright young (and rather scantily-clad) things began to parade themselves indecorously if alluringly throughout our daily lives: ubiquitous in movies, on TV, billboards, and in magazines and newspapers. The most intriguing aspect of this hypersexualisation, however, is that modern society has simultaneously remained prudish in many other regards, most curiously in the case of public nudity; an ‘indecency’ that goes completely unrecognised within so-called primitive societies.

In parallel with these changes, our own culture, which increasingly fixates on youthfulness, has simultaneously fallen into the habit of marginalising old age and death. Not that death, as often presumed, now represents our final unuttered taboo, because arguably more shunned even than death is madness, and presumably because its spectre remains so uniquely terrifying to us.

The overarching point is that no society, however permissive, is ever well-disposed toward individuals who fail to measure up to established norms. The rule is perfectly straightforward in fact: in every society and throughout historical times, social deviants are prone to be ostracised. And as a rule, this applies whether one’s behavioural aberrance is a matter of personal choice or not.

I conjecture, moreover, that our abhorrence of madness is actually informed by the very biological classification of our species and sub-species: Homo sapiens sapiens. The wise, wise man! By which we discreetly imply (in our determinedly positivist account) the rational, rational man! Thus, to “lose your mind”, as we often say colloquially, involves the loss of the singular vital faculty – dare I say our ‘essential’ faculty? – the very thing that taxonomically defines us.

Of course, we are trespassing on hugely controversial territory and into areas I am (by profession) totally unqualified to enter. This must be conceded, whilst nevertheless, I do have privileged access when it comes to entering and exploring the field, as do you. Because we all have insider knowledge and a deeply vested interest when it comes to comprehending the intricate activities of human consciousness, while no-one has the superhuman immunity that ensures perfect mental health – indeed, most people quietly experience episodes, whether passing or more prolonged, when their minds may go a little wonky.

Lastly then, my real purpose is not to dwell on what madness may be, but, arguably more importantly, to consider the consequences of being treated as mad; and in both senses of ‘treated’. So let’s just slip into these white coats. Ready…? Now to begin some informal examination of this rather delicate matter that is of such immediate and absolutely central importance.


I        Sorting the sheep from the goats

“Not all who rave are divinely inspired” – Morris Raphael Cohen


“The sole difference between myself and a madman is the fact that I am not mad!” said Salvador Dalí. 5 Dalí, with his dangerous flair for showmanship, was keen to impress upon his audience the exceptionally deranged quality of his genius, yet this well-known quip appeals in part because genius and madness are already romantically entwined, especially in the popular imagination.

Genius equates to madness presumably because both elude ordinary forms of thinking, and thus, a rather banal accountancy goes: genius appears as madness when it is anything but. Alternatively, however, and as Dalí intimates, genius truly is a form of madness, at least for some. The artistic visionary in particular draws inspiration, if not from literal hallucinatory visions – as the poet William Blake did – then from the upwelling of deep and uncertain psychological forces within.

Fascinated by the half-light and the liminal, impelled upon occasion to peer into the abyss, the genius in extreme cases will indeed tread close to the verge of madness. Yet, most geniuses have not gone mad, nor does genius seem especially vulnerable or susceptible to such self-destructive forces. Even amongst the greatest artists, the exceptions prove the rule – the manic depression of Vincent van Gogh, the profound melancholia of Robert Schumann, the self-destructive alcoholism of Jackson Pollock (and it is noteworthy that van Gogh had a taste for the more deadly alcoholic beverage absinthe), the severe neurosis of Edvard Munch (another excessive drinker), and the depression and tragic suicide of Sylvia Plath. There is nothing however to suggest that Shakespeare or Bach were anything other than entirely sane, or that Mozart, Goethe and Beethoven suffered from frailties or maladies of any lasting psychological kind. The same goes for such modern masters as Picasso, Matisse, Stravinsky, and Mahler – though Mahler did consult Sigmund Freud once for advice on a marital crisis shortly before he died. I could go on and on listing countless sane individuals who excelled in the field of the arts or in other disciplines – indeed Salvador Dalí was another: madness for Dalí being primarily an affectation, as cultured and considered as his trademark moustache, rather than a debilitating affliction.

The problem with all romanticised notions of insanity, especially when upholding insanity as the more honest and thus valid conception of an insane world, is twofold. Not only does it detract from the terrible suffering of those victims most truly lost to the world, but also, and vitally, it mistakes madness for freedom. And there is still a further step. Since madness appears to be a natural manifestation, the most extreme of romanticists have more fervently contended that rather than delusionary, such alternative awareness is no less valid, indeed more valid, than more normalised and thus artificial states of domesticated consciousness. This is a wonderfully tempting fancy for all of us who’ve ever had concerns over a loosening “grip on reality”. Consider, for instance, the following syllogistic fallacy: all geniuses are mad, I’m mad ergo…

But this again is a very lazy method for cancelling madness, in which unpleasant reality is cheaply dismissed basically out of arithmetic convenience, and the two negatives – the horrors of the world and the terrors of the mind – are determined to add to zero. It simply isn’t good enough to say that madness doesn’t exist, or that madness does exist but it is natural and thus wholesome, or even that madness is really just sanity in disguise. That said, and albeit in a more inspirational way, Dalí is speaking for most of us. For the greatest barrier keeping many of us outside the padded cell is that, like him, “we are not mad”.


If sanity and insanity exist, how shall we know them? The question is neither capricious nor itself insane.

So begins a paper published by the journal Science in January 1973 and written by David L. Rosenhan, a Professor of Psychology at Stanford University. The “Rosenhan experiment”, as it is now known, had in fact involved two related studies, the first of which was certainly one of the most daring ever conducted in the social sciences.

Rosenhan would send seven mentally healthy volunteers, with himself making eight, on a mission to be admitted as patients within the American psychiatric system. These eight courageous ‘pseudopatients’ soon afterwards arrived at the doors of selected hospitals with instructions to say only that they were hearing a voice which pronounced these three words: “empty”, “hollow” and, most memorably, “thud”. If admitted, the volunteers were then further instructed to act completely normally and say that they had had no recurrence of those original symptoms. 6

What transpired came as a surprise, not least to Rosenhan himself. Firstly, although none of the volunteers had any prior history of mental illness and none were exhibiting behaviour that could be deemed seriously pathological in any way – Rosenhan having ensured that “[t]he choice of these symptoms was also determined by the absence of a single report of existential psychoses in the literature” – every one of his ‘pseudopatients’ was admitted and so became a real patient. More alarmingly, and as each quickly realised, they had landed themselves in a seemingly intractable catch-22 situation: for how does anyone prove their sanity, once certified insane?

If you say that you are fine, then who is to decide whether or not your expressed feelings of wellness are not delusional? It was certainly not lost on Rosenhan that this is a position all psychiatric patients inevitably find themselves in. In the event, it would take the eight ‘pseudopatients’ almost three weeks on average (19 days to be precise, and in one instance 52 days) to convince the doctors that they were sane enough to be discharged. But it didn’t end there, because all but one were finally discharged with a diagnosis of schizophrenia “in remission”, and as Rosenhan notes:

The label “in remission” should in no way be dismissed as a formality, for at no time during any hospitalization had any question been raised about any pseudopatient’s simulation. Nor are there any indications in the hospital records that the pseudopatient’s status was suspect. Rather, the evidence is strong that, once labeled schizophrenic, the pseudopatient was stuck with that label. If the pseudopatient was to be discharged, he must naturally be “in remission”; but he was not sane, nor, in the institution’s view, had he ever been sane. 7

For a second experiment, Rosenhan then cleverly turned the tables. With results from his first test released, he now challenged a different research and teaching hospital, whose staff fervently denied that they would have made comparable errors, telling them that over the period of three months he would send an undisclosed number of new ‘pseudopatients’ and it was up to them to determine which patients were the imposters. Instead Rosenhan sent no one:

Judgments were obtained on 193 patients who were admitted for psychiatric treatment. All staff who had had sustained contact with or primary responsibility for the patient – attendants, nurses, psychiatrists, physicians, and psychologists – were asked to make judgments. Forty-one patients were alleged, with high confidence, to be pseudopatients by at least one member of the staff. Twenty-three were considered suspect by at least one psychiatrist. Nineteen were suspected by one psychiatrist and one other staff member. Actually, no genuine pseudopatient (at least from my group) presented himself during this period. 8

Rosenhan provocatively although accurately titled his paper “On being sane in insane places”. The results of his study had not only undermined the credibility of the entire psychiatric establishment, but his main conclusion that “we cannot distinguish the sane from the insane in psychiatric hospitals” touched on a far bigger issue. For aside from challenging existing methods of diagnosis, and calling into question the treatment and stigmatisation of mental illness – in view of what he described in the paper as “the stickiness of psychodiagnostic labels” 9 – the results of his study more fundamentally (and thus controversially) cast doubt on whether psychological ‘normality’ can ever be decisively differentiated from ‘abnormality’ in all instances. Buried within his paper, Rosenhan posits:

… there is enormous overlap in the behaviors of the sane and the insane. The sane are not “sane” all of the time. We lose our tempers “for no good reason.” We are occasionally depressed or anxious, again for no good reason. And we may find it difficult to get along with one or another person – again for no reason that we can specify. Similarly, the insane are not always insane.

So the ‘sane’ are not always ‘sane’ and the ‘insane’ are not always ‘insane’, although Rosenhan never leaps to the erroneous conclusion (as others have and do) that there is no essential difference between sanity and insanity. He simply responds to the uncomfortable facts as revealed by his studies and implores other professionals who are involved in care and treatment of psychiatric patients to be extra vigilant. Indeed, he opens his paper as follows:

To raise questions regarding normality and abnormality is in no way to question the fact that some behaviors are deviant or odd. Murder is deviant. So, too, are hallucinations. Nor does raising such questions deny the existence of the personal anguish that is often associated with “mental illness.” Anxiety and depression exist. Psychological suffering exists. But normality and abnormality, sanity and insanity, and the diagnoses that flow from them may be less substantive than many believe them to be.

So although his experiment, albeit a small one, had objectively undermined the credibility of both the academic discipline and the clinical practice of psychiatry, his conclusions remained circumspect (no doubt he wished to tread carefully), with the closing remarks of his paper as follows:

I and the other pseudopatients in the psychiatric setting had distinctly negative reactions. We do not pretend to describe the subjective experiences of true patients. Theirs may be different from ours, particularly with the passage of time and the necessary process of adaptation to one’s environment. But we can and do speak to the relatively more objective indices of treatment within the hospital. It would be a mistake, and a very unfortunate one, to consider that what happened to us derived from malice or stupidity on the part of the staff. Quite the contrary, our overwhelming impression of them was of people who really cared, who were committed and who were uncommonly intelligent. Where they failed, as they sometimes did painfully, it would be more accurate to attribute those failures to the environment in which they, too, found themselves than to personal callousness. Their perceptions and behaviors were controlled by the situation, rather than being motivated by a malicious disposition. In a more benign environment, one that was less attached to global diagnosis, their behaviors and judgments might have been more benign and effective. 10


Before pursuing this matter by delving into deeper complexities, I would like to reframe the central concept almost algebraically. In this regard I am taking the approach of the stereotypical physicist in the joke, who, when asked how milk production on a dairy farm might be optimised, sets out his solution to the problem as follows: “Okay – so let’s consider a spherical cow…” 11

By applying this spherical cow approach to psychiatry, I have produced the three crude equivalences listed below, each accompanied by brief explanatory notes.

#1. Insanity = abnormality

Normality, a social construct [from etymological root ‘right-angled’], implies conventionality, conformity and being in good relation to the orthodoxy [from orthos ‘straight or right’] such that a person is adjudged sane when they appear to be well-balanced, rational, and functional.

#2. Insanity = unhealthiness

Health, a medical consideration [from root ‘whole’] indicates a lack of pathology and in this case emphasises something akin to good mental hygiene. ‘Health’ in the sense of mental health will correspond to low levels of stress and anxiety; high self-awareness and self-assuredness; to happiness and well-being.

And lastly,

#3. Insanity = psychological maladjustment to reality [from late Latin realis ‘relating to things’], with emphasis here placed on authenticity and realism as opposed to fantasy and delusion.

There is, of course, a good measure of crossover between these three pseudo-identities. For instance, if you are ‘normal’ (i.e., adjusted to society) then you have a greater likelihood of being ‘happy’ than if you are at variance. Moreover, if you’re well-adjusted socially, society as a whole will likely attest to you being ‘well adjusted’ in a broader psychological sense, because ‘reality’ is always to some extent socially construed. Imagine, for instance, being suddenly transported to the caste-ossified and demon-haunted worlds of the Late Middle Ages; would the people deemed sane today be thought sane as they disembarked from our imagined time machine, and would they stay sane for long? 12

I have included this rather crude and uncertain section in order to highlight how appearances of ‘madness’ and ‘sanity’ can often be coloured by alternative societal interpretations. As we venture forward, keep this in mind too: societal influences that shape and inform the prevailing notions of ‘normality’, ‘reality’ and even ‘happiness’ are more often latent than manifest.

“Happiness”: the story of a rodent’s unrelenting quest for happiness and fulfilment by Steve Cutts.


Did you ever stride warily over the cracks in the pavement? Have you crossed your fingers, or counted magpies, or stepped around a ladder, or perhaps ‘touched wood’ to ward off some inadvertently tempted fate? Most of us have. Are we mad? Not really, just a little delusional perhaps. Though does superstition itself contain the kernel of madness?

What if that compulsion to step across the cracks becomes so tremendous that the pavement exists as a seething patchwork of uncertain hazards? Or if we really, really feel the urge to touch the wooden object over and over until our contact is quite perfect and precise? When the itch is so irresistible and the desire to scratch quite unbearable, this otherwise silly superstition embroils the sufferer (today diagnosed with Obsessive Compulsive Disorder, or OCD) in extended rituals that must be fastidiously completed: a debilitating affliction in which everyday routine becomes a torment as life grinds nearly to a halt, the paralysed victim reduced to going round and round interminably in the pointless loops of their own devising – life reduced to a barmy and infuriating assault course that is nearly impossible to complete.

As a child, to entertain yourself, did you ever look out for familiar shapes within the amorphous vapour of clouds or the random folds of a curtain? Doubtless you looked up into the night sky to admire the ‘Man in the Moon’, or if you are Chinese, then to spot the rabbit. Both are wrong, and right – connecting the dots being a marvellous human capacity that allows us to be creators extraordinaire. Yet the same aptitude holds the capacity to drive us literally crazy. How about those monsters at the back of your wardrobe or lurking in wait under the bed… and did the devil live around the U-bend of the toilet ready to leap out and catch you if you failed to escape before the flush had ended? It is fun to indulge in such fantasies. Everyone loves a ghost story.

Not that reconstructing faces or other solid forms where none exist involves hallucinating in the truest sense. However, these games, or harmless tics of pattern recognition – which psychologists call pareidolia – do involve our latent faculty for hallucination, a faculty that is more fully expressed in dreams, or just as we are falling asleep and waking; such images are technically described as hypnagogic and hypnopompic respectively. Some of us also hear imaginary things: not only “things that go bump in the night”, but occasionally things that go bang upon waking (or on the brink of sleeping). This highly disconcerting experience even has a technical name, “exploding head syndrome” – just to let you know, in case you ever suffer from it. Alongside still more frightening and otherworldly apparitions (the worst ones are usually associated with sleep paralysis), auditory hallucinations happen to billions of perfectly sober and otherwise sane individuals.

In fact, it is now known that about one percent of people with no diagnosed mental health problem hear voices on a regular basis – approximately the same proportion as are diagnosed with schizophrenia (and it is important to note here that not all schizophrenics hear voices, nor is schizophrenia the only mental illness in which hearing voices is a symptom). Within the general population, still more of us have fleeting episodes of hearing voices, while very nearly everyone will at some time experience the auditory hallucination of voices on the brink of sleep and waking.

Of course, in a different though related sense, we all hear voices: the familiar inner voice that speaks softly as we think, as we read and perhaps as we console ourselves. And how many of us articulate that voice by talking to ourselves from time to time? As young children between the ages of two and eight we all would have done so. Then sometimes, as we literally speak our minds, we also find ourselves listening attentively to what we ourselves just said aloud in these unaccompanied chinwags; although catching yourself fully in the act as an adult can often come as a bit of a shock – but a shock to whom exactly? So are we mad to talk to ourselves… or, as the joke would have it, just seeking a more intelligent conversation!

In talking to ourselves we immediately stumble upon a remarkable and unexpected division in consciousness too. One-self becomes two selves. The ‘I’ as subjective knower abruptly perceives a ‘me’ as a separate entity – perhaps this known ‘me’ perceived by the knower ‘I’ is deemed worthy of respect (but perhaps not; the knower can decide!). Curiously, this is not just a mind becoming vividly aware of its existence as a manifestation (modern science would say ‘epiphenomenon’, as if this were an adequate explanation) of the brain-body – and such consciousness of the material self is strange enough – but the mind becoming literally self-aware, and this self-awareness having endlessly self-reflecting origins, since if ‘I’ begin to think about ‘me’ then there can now exist a further ‘I’ which is suddenly aware of both the original knower and the already known. Fuller contemplation of this expanding hall of mirrors where the self also dwells is very possibly a road to madness; yet this habit of divorcing ‘I’ from ‘me’ is a remarkably familiar one. As usual, our language gives us away: we “catch ourselves” in the act, afterwards commenting “I can’t believe I did it!” But what if our apprehension of the one-self becomes more broken still, and our sense of being can only be perceived as if refracted through shattered glass: the splintered fragments of the anticipated ‘me’ (whatever this is) appearing horrifically other?

Perhaps we’ve even had intimations of a feeling that we are entirely disconnected from every other part of the universe, and as such, then felt profoundly and existentially cast adrift with no recall of who we are. Such altered states of detachment are known in psychology as ‘dissociation’ and are not uncommon, especially to those with any appetite for ‘recreational substances’. Even alcohol is known to sometimes elicit temporary dissociation. And if these are representative of some of our everyday brushes with madness, then what of our more extended nocturnal lapses into full-blown irrationality: the hallucinations we call dreams and nightmares, and those altogether more febrile deliriums that occasionally take hold when we are physically ill?

These are the reflections of Charles Dickens, after one of his night walks brought on by insomnia led him to nocturnal contemplation of Bethlehem Hospital:

Are not the sane and the insane equal at night as the sane lie a dreaming? Are not all of us outside this hospital, who dream, more or less in the condition of those inside it, every night of our lives? Are we not nightly persuaded, as they daily are, that we associate preposterously with kings and queens, emperors and empresses, and notabilities of all sorts? Do we not nightly jumble events and personages and times and places, as these do daily? Are we not sometimes troubled by our own sleeping inconsistencies, and do we not vexedly try to account for them or excuse them, just as these do sometimes in respect of their waking delusions? Said an afflicted man to me, when I was last in a hospital like this, “Sir, I can frequently fly.” I was half ashamed to reflect that so could I by night. Said a woman to me on the same occasion, “Queen Victoria frequently comes to dine with me, and her Majesty and I dine off peaches and maccaroni in our night-gowns, and his Royal Highness the Prince Consort does us the honour to make a third on horseback in a Field-Marshal’s uniform.” Could I refrain from reddening with consciousness when I remembered the amazing royal parties I myself had given (at night), the unaccountable viands I had put on table, and my extraordinary manner of conducting myself on those distinguished occasions? I wonder that the great master who knew everything, when he called Sleep the death of each day’s life, did not call Dreams the insanity of each day’s sanity. 13

Meanwhile, obsessing over trifling matters is a regular human compulsion. The cap is off the toothpaste. The sink is full of dishes. That’s another tin gone mouldy in the fridge… during times when our moods are most fraught, seething with dull anger and impatient to explode at the slightest provocation, it is the fridge, the sink and the toothpaste that fill our heads with troubles. Presumably, again, there is a limit beyond which such everyday obsessing becomes pathological. Indeed, I dare to suggest that obsessing over mundanities may be a kind of displacement activity: another distraction from the greatest unknown we all face – our certain endpoint with its dread finality. For we may, not without justification, dread our entire future, and with it the whole world outside our door; just as we may, with due reason based on past experiences, panic at the prospect of every encounter.

But whereas normal levels of fear act as a helpful defence mechanism and a necessary hindrance, the overbearing anxiety of the neurotic comes to stand in full opposition to life. Likewise, although indignation can be righteous and rage too is warranted on occasion, a constantly seething ill temper that seldom settles is corrosive to all concerned. In short, once acute anxiety and intense irritability worsen and manifest as part of a chronic condition, life is irredeemably spoiled; at still greater severity, anxiety and anger will likely be deemed symptoms of a psychiatric condition. The threshold to mental illness is once again crossed – but whereabouts was the crossing point?

Each of us has doubtless succumbed to moments of madness – and not just momentary lapses of reason, but perhaps more extended periods when we have been caught up in obsessive and incoherent patterns of thought and behaviour. Loops of loopiness. Moreover, the majority of us will have had occasions of suicidal ideation, which again remain unspoken, in part because they signal a psychological frailty that may point to a deeper pathology, or be mistaken as such. For madness is not really such a faraway and foreign country, and even the sanest among us (so far as this can be judged) are from time to time permitted entry at its gates.


II       Conspiracies against the laity

“That a dictator could, if he so desired, make use of these drugs for political purposes is obvious. He could ensure himself against political unrest by changing the chemistry of his subjects’ brains and so making them content with their servile condition. He could use tranquillizers to calm the excited, stimulants to arouse enthusiasm in the indifferent, hallucinants to distract the attention of the wretched from their miseries. But how, it may be asked, will the dictator get his subjects to take the pills that will make them think, feel and behave in the ways he finds desirable? In all probability it will be enough merely to make the pills available.”

— Aldous Huxley 14


In earlier chapters I have discussed how science is soon out of its depth when it comes to understanding the mind and states of consciousness, because the province of science is restricted to phenomena that not only can be observed and unambiguously categorised, but thereafter measured with known precision and modelled to an extent that is reliably predictive. Of course, hidden within that statement is an awful lot of maths; however, the use of maths is not the issue here – measurement is.

For measurement becomes scientifically applicable once, and only once, there is a clear demarcation between the quantities we wish to measure. Length and breadth are easy to separate; time and space, likewise. The same applies to many physical properties – in fact, to all of the quantities that physicists and chemists take for granted.

When we come to psychology and psychiatry we are likewise restrained. Brain-states are measurable, and so we investigate these and then attempt to map our findings back onto sense-impressions, memories and moods. For instance, if we locate a region of the brain where these sense-impressions, memories and moods can be stimulated, then we can begin the partial mapping of conscious experience onto brain-states. But we still have not analysed consciousness itself. Nor do we know how brain-states permit volition – the choice of whether to move, and how and where to move, or, just as importantly, the freedom to think new thoughts. In short, how does our brain actually produce our states of mind, our personalities, and the entity we each call ‘I’? As neurologist Oliver Sacks noted in his book A Leg to Stand On, in which he drew on his personal experience of a freak mountaineering accident to consider the physical basis of personal identity:

Neuropsychology, like classical neurology, aims to be entirely objective, and its great power, its advances, come from just this. But a living creature, and especially a human being, is first and last active – a subject, not an object. It is precisely the subject, the living ‘I’, which is being excluded. Neuropsychology is admirable, but it excludes the psyche – it excludes the experiencing, active, living ‘I’. 15

We as yet have no grounds whatsoever to suppose that science will ever be able to objectively observe and measure states of consciousness. In fact, what would that actually entail? For we do not have even the slightest inkling what consciousness is; nor, far more astonishingly, do we as yet understand how consciousness is routinely and reversibly switched off with the use of general anaesthetics, even though these have been widely and effectively used in surgery for over a century and a half.

Moreover, having acknowledged its non-measurability, some scientists consider it permissible to casually relegate consciousness to the status of an epiphenomenon. That is, science takes the singular certainty of our everyday existence and declines to take any serious interest in its actual reality; in the most extreme case, proclaiming that it is purely illusory… Now think about that for a second: how can you have the ‘illusion of consciousness’? For what vehicle other than a conscious one can support or generate any kind of illusion at all? Although language permits us to frame the idea, it is inherently self-contradictory, and proclaiming the illusoriness of consciousness is akin to deciding on the insubstantiality of substance or the unwetness of water.

South African psychoanalyst and neuropsychologist Mark Solms, who has devoted his career to reconnecting these scientific disciplines, here makes a persuasive case, built upon studies of brain-damaged patients, that the source of consciousness cannot lie within the higher-level cortex, as has been generally supposed, but instead involves mechanisms operating within the brain stem:

Furthermore, the literal root of our modern terms ‘psychology’, ‘psychoanalysis’ and ‘psychiatry’ is the Greek word ‘psyche’, with its origins in ‘spirit’ and ‘soul’, and yet each of these disciplines has altogether abandoned this view in order to bring a strictly biomedical approach to questions of mind. No longer divorced from the brain, mind is thus presumed to be nothing more or less than an output of brain function, and so the task of today’s clinicians becomes one of managing these outputs by means of physical or chemical adjustments. To these ends, the origins and causes of mental illness are often presumed to be fully intelligible and detectable in abnormalities of brain physiology, and most specifically in brain chemistry – something I will discuss in greater detail.

Taking such a deeply biochemical approach to mental illness also leads inexorably to questions of genetics, since there is no doubt that genes predispose every person to certain illnesses; and so, with regard to the issue at hand, we might envisage some kind of psychological equivalent to the physical immune system. There is indeed no controversy in saying that the individual propensity to suffer mental illness varies, or, if you prefer, that we inherit differing levels of psychological immunity. Some people are simply more resilient than average, and others less so; and this difference in propensity – one’s ‘psychological immune system’ – is to some extent innate.

Of course, if genetic propensity were the primary determinant of rates of mental illness, then within any given gene pool we ought to expect a steady rate of diagnosis, given that variations within any gene pool accrue comparatively slowly and over multiple generations. Evidently genetics alone cannot therefore explain any sudden and dramatic rise in the incidence of health problems, whether mental or otherwise. One note of caution here: the newer field of epigenetics may yet have something to add to this discussion.

But psyche, to return to the main point, is not a purely biological phenomenon determined solely by genetics and other wholly material factors such as diet, levels of physical activity and so forth. For one thing, mind has an inherent and irreducible social component, and this is the reason solitary confinement and similar forms of deprivation of social stimulus are exceedingly cruel punishments. Taking the still more extreme step of subjecting a victim to the fullest sensory deprivation becomes a terrifying form of torture, and one that rapidly induces psychological breakdown. All of this is well established, and yet still the scientific tendency is to treat minds as highly sophisticated programmes running on the wetware of our brains. But the wetware, unlike the hardware and software of the computer in front of me, possesses both subjectivity and agency. Put the other way around: the brain isn’t the conscious agent; you are. And it is equally true to say, as the great theoretical physicist Max Planck elegantly pointed out, that consciousness is absolutely foundational:

I regard consciousness as fundamental. I regard matter as derivative from consciousness. We cannot get behind consciousness. Everything that we talk about, everything that we regard as existing, postulates consciousness. 16

Planck is precisely right to say we cannot get behind consciousness. And by everything he quite literally means everything – including, of course, the brain – although unfortunately we are very much in the bad habit of forgetting this glaring fact.

With developments in neurology and biochemistry, science becomes ever more accomplished at measuring and, with increasing refinement, at altering brain function – and in doing so, at altering states of consciousness. Yet even while a scientist or doctor is manoeuvring a patient’s mind, he remains deeply ignorant of how the change is achieved; and it is worth bearing in mind that methods for altering states of consciousness were known and practised across all cultures long before the advent of science.

To offer a hopefully useful analogy: when tackling problems of consciousness, our best scientists remain in the position of a motorist who lacks mechanical understanding. The steering wheel changes direction and two of the pedals make the car go faster or slower – yet another pedal does something more peculiar again, which we needn’t dwell on here! Of course, our imaginary driver is able to use all these controls to manoeuvre the car – increasingly well with practice. Added to which, he is free to lift the bonnet and look underneath; however, without essential knowledge of engineering or physics, this provides no eye-opening additional insight. But such an analogy breaks down (if you’ll pardon the pun), as every analogy here must, because, as Planck says, when it comes to consciousness all our understanding of the world – all concepts, including in this instance the concept of mechanism – is contingent on it.

For these reasons we might quite reasonably ask which factors the psychiatrist ought to invest greater faith in: definite quantities or indefinite qualities? Measureable changes in electrical activity or a patient’s reports of mood swings? Rates of blood flow or recognisable expressions of anxiety? Levels of dopamine or the unmistakeable signs of the patient’s sadness and cheerfulness?

More philosophically, we might wonder more deeply about what awareness is. How do we navigate the myriad nooks and crannies of the world that our minds (in a very real sense) reconstruct – our perceptions being projections, informed by sensory inputs and produced to give the appearance of external reality – in order to inquire into the nature of both the world and the organs of perception and cognition, when the precursory nature of awareness somehow remains tantalisingly beyond all reconstruction? When confronted by these questions science is struck dumb – it is dumbfounded. Obviously, so too is psychiatry.

The mathematician and physicist Roger Penrose has devoted a great deal of time to thinking about the nature of consciousness, and in his best-selling book The Emperor’s New Mind (1989) he explained why science is wrong to presume that it is a purely computational process. In conversation with AI researcher Lex Fridman, here Penrose again stresses our lack of basic scientific understanding of consciousness and proffers his own tentative ideas about where we might begin looking – in particular, how investigating the causal mechanisms underlying general anaesthetics looks like a profitable place to start:


In the early 1960s, tired of signing his name on the skin of naked women, transforming them instantly into living sculptures (and what’s not to like about that?), the avant-garde Italian artist Piero Manzoni turned his hand instead to canning his own excrement and selling the tins to galleries. In May 2007, a single tin of Manzoni’s faeces was sold at Sotheby’s for more than £100,000; more recently, in Milan, another tin of his crap fetched close to a quarter of a million! It would be madness, of course, to pay anything at all for bona fide excrement (and it remains uncertain whether Manzoni’s labels reliably informed his customers of their literal contents), were it not for the fact that other customers were queuing up and happy to pay as much or more. Indeed, if anyone can ever be said to have had the Midas touch, then surely it was Manzoni; just a flick of his wrist miraculously elevated anything at all to the canonised ranks of high art – literally turning shit into gold.

But then the art world is an arena that excels in perversity and so pointing out its bourgeois pretensions and self-indulgent stupidities has itself become a cheap pursuit, while to the initiated it simply marks me out as another unenlightened philistine. What is blindingly obvious to the rest of us has instead become undetectable to the connoisseur, the banality obscured by fashion and their own self-gratification. In an era that is exceptionally cynical and commercial, it comes as no surprise therefore to find the art world reflecting and extolling works of commensurate cynicism and degeneracy. What is more interesting, however, is this contemporary notion that art has finally become anything done by an artist: for we might reasonably ask, does this same approach to validation apply across other disciplines too? For instance, if scientists collectively decide to believe in a particular method or theory, does this automatically make their shared belief somehow ‘scientific’? I pose this as a serious question.

What is more important here is to understand and recognise how all intellectual fields face a similar risk of losing sight of what is inherently valuable, becoming seduced by collective self-deception and wrapped up in matters of collective self-importance. Peer pressure. Groupthink. The bandwagon effect. If you’ve never seen the footage before then I highly recommend watching Solomon Asch’s ‘conformity experiments’, in which test subjects were found to consistently and repeatedly defer to false opinion, in blatant contradiction to what they could see perfectly clearly right in front of their own eyes. 17

In short, most people will “go along to get along”, and this maxim applies across all levels of society and in all spheres of activity, including the sciences. Moreover, it is very seldom the case that a scientific paradigm changes because its opponents are suddenly won over by a novel framework of ideas due to its intrinsic elegance or power; rather, as Max Planck put it most bluntly (at least as it is usually paraphrased): “Science progresses one funeral at a time”. 18

These problems are additionally compounded by reification: the mistaking of abstractions for solid aspects of reality; of confusing the map with the territory. Related to this is something William James once described as the “Psychologist’s fallacy”:

The great snare of the psychologist is the confusion of his own standpoint with that of the mental fact about which he is making his report. I shall hereafter call this the ‘psychologist’s fallacy’ par excellence. 19

There are actually three ways of interpreting James’ statement here, and each is equally applicable. The first and most general cautions against mistaking one’s personal perception and interpretation of an event for a perfectly accurate account – this strictly applies to all fields of objective research. The next is that it is easy to misconstrue another person’s experience and falsely imagine it is identical to your own. This ‘confusion of standpoints’ can cause you to believe you know why someone did what they did, by assuming they are motivated in just the same way you are. Then finally, there is an error that applies whenever you are studying another person’s mental state (for whatever reason, and not necessarily in a clinical setting) and you suppose that the subject is likewise critically aware of their own thoughts and actions. This is called ‘attribution of reflectiveness’, and it may occur, for instance, if you come across someone blocking your way and presume that they are fully aware of the obstruction they have caused to your progress and are obviously being inconsiderate.

Besides the issues of groupthink and the fallacies outlined above, there is a related difficulty that arises whenever you are constrained by any system of classification, and given how incredibly useful categories are (especially in the sciences), this is again hard to avoid. Whenever a system comes to be defined and accepted, the tendency will always be for adherents to look for and find examples that fit and support it; and if this means cherry-picking the facts then so be it. Within no time an entire discipline can spring up this way, as was the case with phrenology (a subject I shall come back to in a later chapter).


George Bernard Shaw wittily remarked that “all professions are conspiracies against the laity”. In the same spirit, we might extend his concern by adding that such conspiracies will tend to feign understanding, disguise ambiguity and perpetuate fallacies. The quip itself comes from Shaw’s play The Doctor’s Dilemma, and was most pointedly aimed at the medical profession. But then, in defence of doctors, medicine as a discipline is arguably the science most plagued by vagueness: a nearly intractable problem, given how easily the symptoms of so many diseases can be muddled simply because of their inherent similarities. Consider, for instance, the thousand and one ailments that all have “flu-like symptoms”.

In turn, patients are equally prone to vagueness when giving accounts of their own symptoms, in part because symptoms are often rather difficult to describe – just how do you distinguish the various feelings of pain, for instance? To make matters worse, human biology is already fiendishly complex. Textbooks provide only textbook examples: they show ideal anatomy, while real anatomies are seldom ideal, and it is surprisingly common for actual patients to have organs whose structures or locations are markedly different.

The unavoidable outcome of all this uncertainty and peculiarity is that medical professionals do not understand nearly half so much as those without medical training are given to believe – and, importantly, choose to believe. For, as patients, not only do we seek clear diagnoses, but we look to medicine for sure-fire remedies, all of which encourages the inclusion in medical nomenclature of elaborate – and preferably Latinised – labels for the full gamut of our daily complaints: a complete taxonomy that catalogues and accounts for every combination of symptoms, including one or two half-glimpsed maladies. All of which brings us to a consideration of ‘syndromes’ and ‘disorders’.

When your doctor diagnoses abc-itis then, presuming the diagnosis is correct, it is certain that you have inflammation of your abc. Diagnoses of thousands of complaints and diseases are absolutely clear-cut like this. If told you are suffering from xyz syndrome, however, it may instead mean that you are presenting with a cluster of symptoms recognised to occur in a specific combination; a grouping that crops up often enough to have acquired the label ‘xyz syndrome’, rather than a disease with a well-established or single underlying cause. In short, the term ‘syndrome’ will sometimes hide a great deal more than it reveals.

Whenever a pattern of symptoms has been rolled together and labelled for the sake of convenience under a single catch-all name, that name is shorthand for saying: we recognise the signs, and though we can’t tell you the cause and as yet remain unable to recommend a cure, we are working on it! Were the shorthand unavailable, the clinician would instead have to shrug their shoulders and usher you away; and given that patients usually have a strong preference for receiving (at the very least) a name for the cause of their suffering, the more customary exchange allows both parties to leave the consultation far happier. We are often content, therefore, to indulge our medical (and other) experts in maintaining many of these Shavian “conspiracies” against us.

Returning to psychiatry, it is necessary to appreciate that all but the rarest of psychiatric diagnoses fall under the category of ‘disorders’ rather than diseases – and that the underlying aetiology in many cases is not just unknown but more or less unconsidered. It follows that historically, the development of diagnoses and treatments has very often had recourse to little more than educated hunches and trial-and-error testing on (all too often) unwilling patients.

As former National Institute of Mental Health (NIMH) Director, Thomas Insel, pointed out:

“While DSM [Diagnostic and Statistical Manual of Mental Disorders] has been described as a “Bible” for the field, it is, at best, a dictionary, creating a set of labels and defining each. The strength of each of the editions of DSM has been “reliability” – each edition has ensured that clinicians use the same terms in the same ways. The weakness is its lack of validity. Unlike our definitions of ischemic heart disease, lymphoma, or AIDS, the DSM diagnoses are based on a consensus about clusters of clinical symptoms, not any objective laboratory measure. In the rest of medicine, this would be equivalent to creating diagnostic systems based on the nature of chest pain or the quality of fever. Indeed, symptom-based diagnosis, once common in other areas of medicine, has been largely replaced in the past half century as we have understood that symptoms alone rarely indicate the best choice of treatment. Patients with mental disorders deserve better.” 20


Psychiatrist: Have you ever heard of the old saying “a rolling stone gathers no moss?”

Patient: Yeah.

Psychiatrist: Does that mean something to you?

Patient: Uh… it’s the same as “don’t wash your dirty underwear in public.”

Psychiatrist: I’m not sure I understand what you mean.

Patient: [smiling] I’m smarter than him, ain’t I? [laughs] Well, that sort of has always meant, is, uh, it’s hard for something to grow on something that’s moving.

If you’ve seen the film One Flew Over the Cuckoo’s Nest 21 then you may recognise the dialogue above. It comes when the central protagonist Randle McMurphy (brilliantly played by a young Jack Nicholson) is subjected to a follow-up evaluation carried out by a team of three psychiatrists trying to determine whether or not he is fit to be discharged.

The film was released only a couple of years after Rosenhan and his ‘pseudopatients’ had sneaked under the diagnostic radar, and like Rosenhan and his associates – though for reasons which we need not go into – McMurphy is an apparently sane inmate plunged into an infuriating and intractable catch-22 situation.

Now the question posed to McMurphy appears an odd one, yet questions of precisely this kind, commonly based around well-known proverbs, were once regularly used for diagnostic purposes. Just as with the better-known Rorschach inkblot test, there is no single ‘correct’ answer, but there were built-in ways a patient might fail such an examination. In this case, responses considered too literal were taken as evidence of pathology, on the grounds that they showed an inability on the patient’s part to think in anything other than concrete terms. Simply re-expressing the proverb so as to explain precisely how a rolling rock makes an inhospitable environment for vegetation is therefore an ill-advised response.

Indeed, McMurphy’s second answer conclusively fails the test, whereas his first stab at saying something deliberately obtuse merely confuses the three doctors. Of course, in the film it is McMurphy’s deeply rebellious nature and truculent behaviour, rather than the results of tests of this sort, that ultimately seal his fate – and again there is no need for details here, except to add that whilst the ramifications of Rosenhan’s experiment challenged opinions within academic and professional circles, the multiple Academy Award-winning One Flew Over the Cuckoo’s Nest reached a far wider audience and helped to change the public perception of how we care for the mentally ill. Moreover, Rosenhan’s criticisms had been restrained, whereas the film – like the book – went straight for the jugular.

Ken Kesey, author of the book One Flew Over the Cuckoo’s Nest, was involved with the film adaptation, but for a variety of reasons, including a dispute over royalties, he left barely two weeks into production and afterwards claimed never to have watched the final version. Embedded below is a short interview with Kesey talking about the main characters, interspersed with relevant clips:

In the wake of Rosenhan’s experiment and Kesey’s fictional portrayal of life inside the asylum (published in 1962, and released as a film in 1975), the ‘anti-psychiatry’ movement (a term coined by one of its most prominent advocates, South African psychiatrist David Cooper, in 1967) soon began to gain political traction. With the legitimacy of mainstream psychiatry under sustained attack and the very concept of mental illness suddenly coming under scrutiny, in the midst of this crisis the American Psychiatric Association (APA) made the decision to release a new manual: a fully updated directory that would authoritatively categorise, and thus authenticate, all forms of ‘mental disorder’.

The Diagnostic and Statistical Manual of Mental Disorders – soon to be known as ‘the bible of psychiatry’ – is now in its fifth edition, DSM-5, and with each updated edition it has become an ever weightier tome, expanding at a faster rate than almost any other technical manual in history. The snowballing really began in 1968, when the revised second edition introduced an additional seventy-six ‘disorders’, thereby expanding the original 1952 catalogue by more than 70 percent. When revised again in 1980, DSM-III added a further 83 diagnostic categories, its list growing from 182 (DSM-II) to 265 (DSM-III) – a 150 percent increase on the original. The same trend continued, though less conspicuously, when DSM-IV was released in 1994, cataloguing a total of 410 disorders – almost a four-fold increase on the original.

James Davies is a Reader in Social Anthropology and Mental Health at the University of Roehampton, a psychotherapist, and co-founder of the Council for Evidence Based Psychiatry. In trying to understand how the present manual came to be constructed, he decided to speak to many of its authors directly, and so in May 2012 he took a trip to Princeton. There he was welcomed by Dr Robert Spitzer, who had chaired the core team of nine people that put together the seminal third edition of the DSM, which amongst other things established the modern diagnostic system still broadly in operation. It was this edition of the manual that introduced such household-name disorders as Borderline Personality Disorder and Post-Traumatic Stress Disorder. For these reasons, Spitzer is widely regarded as the most influential psychiatrist of the last century.

Davies began his interview by asking Spitzer what the rationale had been behind the significant expansion in the number of disorders in DSM-III, and Spitzer told him:

“The disorders we included weren’t really new to the field. They were mainly diagnoses that clinicians used in practice but which weren’t recognised by the DSM or the ICD.” 22

Davies then pressed further, asking how many of these disorders had been discovered in a biological sense. In reply, Spitzer reminded him that “there are only a handful of mental disorders… known to have a clear biological cause”, adding that these organic disorders, such as epilepsy, Alzheimer’s and Huntington’s, are “few and far between”, and conceding that no biological markers have been identified for any of the remaining disorders in the DSM. With this established, Davies asked how the DSM taskforce did determine which new disorders to include. Spitzer explained:

“I guess our general principle was that if a large enough number of clinicians felt that a diagnostic concept was important in their work, then we were likely to add it as a new category. That was essentially it. It became a question of how much consensus there was to recognise and include a particular disorder.” 23

Davies also spoke to Dr Theodore Millon, another of the leading lights on Spitzer’s taskforce, to ask more about the construction of their manual. Millon told him:

“There was little systematic research, and much of the research that existed was really a hodgepodge – scattered, inconsistent, and ambiguous. I think the majority of us recognised that the amount of good, solid science upon which we were making our decisions was pretty modest.” 24

Afterwards, Davies had put Millon’s points directly to Spitzer, who responded:

“Well it’s true that for many of the disorders that were added, there wasn’t a tremendous amount of research, and certainly there wasn’t research on the particular way that we defined these disorders… It is certainly true that the amount of research validating data on most psychiatric disorders is very limited indeed.”

Adding that:

“There are very few disorders whose definition was a result of specific research data.” 25

On the basis of Spitzer’s surprising admissions, Davies then tracked down other members of the same DSM team. For instance, he spoke on the phone to Professor Donald Klein, another leader on the taskforce, who said:

“We thrashed it out basically. We had a three-hour argument… If people [at the meeting] were still undecided the matter would be eventually decided by a vote.” 26

Finally, Davies decided to check what he was hearing from these members against the minutes of the taskforce meetings, which are still held in the archives, and discovered that voting did indeed take place to make such determinations. Renee Garfinkel, a psychologist who participated in two DSM advisory subcommittees, put it to Davies more bluntly:

“You must understand what I saw happening in these committees wasn’t scientific – it more resembled a group of friends trying to decide where they want to go for dinner.”

She then cited the following concrete example of how one meeting had proceeded:

“As the conversation went on, to my great astonishment one Taskforce member suddenly piped up, ‘Oh no, no, we can’t include that behaviour as a symptom, because I do that!’ And so it was decided that that behaviour would not be included because, presumably, if someone on the Taskforce does it, it must be perfectly normal.” 27

Although it was put together by a rather small team, DSM-III has had a far-reaching and long-lasting influence on psychiatry. Spitzer told Davies:

“Our team was certainly not typical of the psychiatry community, and that was one of the major arguments against DSM-III: it allowed a small group with a particular viewpoint to take over psychiatry and change it in a fundamental way.

“What did I think of that charge? Well, it was absolutely true! It was a revolution, that’s what it was. We took over because we had the power.” 28

In any case, reliance upon a single definitive and encyclopaedic work of this kind presents a great many hazards. As Allen Frances, the former chairman of the psychiatry department at Duke University School of Medicine, who led the taskforce that produced DSM-IV, has publicly admitted:

At its annual meeting this week [in May 2012], the American Psychiatric Association did two wonderful things: it rejected one reckless proposal that would have exposed nonpsychotic children to unnecessary and dangerous antipsychotic medication and another that would have turned the existential worries and sadness of everyday life into an alleged mental disorder.

But the association is still proceeding with other suggestions that could potentially expand the boundaries of psychiatry to define as mentally ill tens of millions of people now considered normal.

In the same op-ed published by the New York Times, Frances continued:

Until now, the American Psychiatric Association seemed the entity best equipped to monitor the diagnostic system. Unfortunately, this is no longer true. D.S.M.-5 promises to be a disaster — even after the changes approved this week, it will introduce many new and unproven diagnoses that will medicalize normality and result in a glut of unnecessary and harmful drug prescription. The association has been largely deaf to the widespread criticism of D.S.M.-5, stubbornly refusing to subject the proposals to independent scientific review.

Many critics assume unfairly that D.S.M.-5 is shilling for drug companies. This is not true. The mistakes are rather the result of an intellectual conflict of interest; experts always overvalue their pet area and want to expand its purview, until the point that everyday problems come to be mislabeled as mental disorders. Arrogance, secretiveness, passive governance and administrative disorganization have also played a role.

New diagnoses in psychiatry can be far more dangerous than new drugs. 29

In an earlier interview with Wired magazine, Frances – credited as “the guy who wrote the book on mental illness” – made an even more startling confession, telling Gary Greenberg, himself a practising psychotherapist:

“[T]here is no definition of a mental disorder. It’s bullshit. I mean, you just can’t define it… these concepts are virtually impossible to define precisely with bright lines at the boundaries.” 30


Psychiatry’s entry into the province of science is a comparatively recent development. Indeed, in the ancient world and in times prior to the Enlightenment, some severe forms of mental illness would most likely have appeared the work of demons. And if a person was believed to be possessed, then religious protocols – informed by the opinion that their soul was in existential peril and without intervention would suffer eternal damnation – called for extremely drastic measures.

Indeed, the very word psychiatry derives (as mentioned above) from the Greek psukhē, ‘breath, life, soul’ (Psyche being also the Greek goddess of the soul), though in accordance with its strict biomedical model of mind, psychiatry today takes no interest in such ‘spiritual’ matters. Nevertheless, psychiatry’s interventions to save a person’s mind have often been just as drastic as, and if anything crueller than, those inflicted throughout prior ages. The dark arts of exorcism and trepanning were superseded and upgraded by technological means: the unfortunate victims at first subjected to convulsions induced by administering an overdose of insulin, and later by high-voltage electric shocks passed between the temples (electroconvulsive therapy, or ECT). Still more invasive treatments introduced during the twentieth century excised a patient’s demons by means of irreversible surgical mutilation.

When we retrace the short but terrible history of psychiatry, it is rather easy to overlook how many of these barbaric pseudoscientific treatments were once lauded as state-of-the-art. As recently as 1949, Portuguese neurologist António Egas Moniz actually shared the Nobel Prize for Medicine for his invention of a routine procedure for carrying out lobotomies; his original procedure was later adapted by American neurologist Walter Freeman, who used an ice pick hammered through the eye socket to sever the frontal lobes. Such horrific procedures were frequently performed without anaesthetic and led to the destruction of the minds – although I am tempted to say souls – of tens of thousands of people, the majority of whom were women (homosexuals were also predominant amongst the victims). The use of so-called ‘psychosurgery’ was phased out gradually, but lobotomies continued to be performed into the 1970s and even later. 31

Today it is also an open, if dirty, secret that throughout modern times psychiatry has played a pivotal role in the coercion of political opponents of the state. Many authoritarian regimes – the former Soviet Union being the most frequently cited – operated their mental health systems as a highly efficient means of cracking down on dissidents (who, more or less by definition, failed to think ‘normally’). The abuse of psychiatry by western governments is less well known; however, at the height of the Cold War the CIA carried out a whole range of experiments under Sidney Gottlieb’s MKUltra mind control programme.

One of the MKUltra researchers was Ewen Cameron, a former President of the American Psychiatric Association (APA), who went so far as to attempt to erase his patients’ existing memories entirely, by means of massive doses of psychotropics and ECT, in attempts to reprogram the victim’s psyche from scratch. Decades later, some of the survivors won financial compensation for what they had endured under this secret regime of state-sponsored torture. 32 Moreover, this close collaboration between the military intelligence services and the APA has continued, and during the “War on Terror” a number of ‘operational psychologists’ are now known to have worked on the CIA’s “enhanced interrogation” torture programme. 33

Of course, state coercion is not always aimed at political enemies. Minorities who have suffered discrimination for other reasons have likewise fallen victim to psychiatric abuse. In fact, prior to 1973 – while homosexuality was still designated a disease and listed among the ‘mental disorders’ of the DSM ‘bible’ – otherwise healthy gay men were forcibly subjected to aversion ‘therapies’ that included electric shocks to the genitals and nausea-inducing drugs administered simultaneously with the presentation of homoerotic stimuli. In Anthony Burgess’s novel A Clockwork Orange (1962) this was called “the Ludovico Technique”.

Thus historically, the insane subject – i.e., anyone diagnosed as mentally ill – has been uniquely deprived of their basic human rights: downgraded in social status and transformed de facto into a kind of second-class human. Even today, when clinical procedures are kinder, patients are routinely subjected to many involuntary treatments, including the long-term administration of powerful drugs and ECT.


Leaving aside the moral questions, this terrible history also casts a shadow over the whole science underpinning these treatments. What do we really know about the efficacy of ECT today that we didn’t know in the middle of the last century?

Or consider the now familiar labelling of drugs as ‘antipsychotic’ and ‘antidepressant’: terms that are wholly misleading and deeply unscientific, since the implication is that these are antidotes much like antibiotics, acting to cure a specific disease by targeting its underlying pathology. But this is entirely false, and the reason it is misleading can best be understood by once again reviewing the history of psychiatry.

Firstly, it is important to recognise that none of the first generation of psychiatric drugs was developed for the purpose either of alleviating neurological dysfunction or of enhancing brain activity. Chlorpromazine (CPZ) – marketed under the brand names Thorazine and Largactil – the earliest of these ‘antipsychotics’, had previously been administered as an antihistamine to relieve shock in patients undergoing surgery, although it was in fact derived from a family of drugs called phenothiazines, originally used as antimalarials and to combat parasitic worm infestations. 34

It had been noticed, however, that many of the patients who received Thorazine would afterwards manifest mood changes, in particular experiencing a deadening of their emotional response to the external world while otherwise retaining full consciousness. In short, the drug happened to reproduce the effects observed in patients who underwent surgical lobotomy (which in 1950 was, of course, still considered a highly effective treatment for psychosis).

‘Antidepressants’, on the other hand, emerged as a by-product of research into tuberculosis, after it was noticed that some patients in the trials became more roused following their medication. Only in the aftermath of studies carried out during the 1960s did science finally begin to understand how these pharmaceuticals were having direct effects within the brains of patients, and specifically on processes involving, respectively, the neurotransmitters dopamine and serotonin. In patients suffering psychosis there was found to be an excess of the former, whereas those suffering depression showed an apparent deficit of the latter. The conclusion followed that the drugs must be acting to correct an existing imbalance, much as insulin does in the case of diabetes.

So the conclusions from these early studies were drawn wholly from an understanding of the drugs’ mechanism of action. Since the antipsychotics were found to block dopamine receptors, the hypothesis formed that psychosis must be due to an excess of dopamine activity; likewise, since antidepressants held serotonin longer in the synaptic cleft (the space that separates and forms a junction between neurons), boosting its activity, it followed that depression was a result of low serotonin activity. This reasoning turns out to be inherently flawed, however, and as subsequent research quickly revealed, the actual differences in brain chemistry detected in patients were a feature not of any underlying pathology associated with their disorder, but a direct effect of the medications used to treat them. Indeed, for decades clued-up pharmacologists and many psychiatric practitioners have regarded the theory of ‘chemical imbalance’ not as a scientific model but as nothing more than a metaphor: a means of explaining the treatment to patients, as well as an encouragement.

This is what Ronald W. Pies, Editor-in-Chief Emeritus of Psychiatric Times, wrote a decade ago about the ‘theory of chemical imbalance’:

“I am not one who easily loses his temper, but I confess to experiencing markedly increased limbic activity whenever I hear someone proclaim, “Psychiatrists think all mental disorders are due to a chemical imbalance!” In the past 30 years, I don’t believe I have ever heard a knowledgeable, well-trained psychiatrist make such a preposterous claim, except perhaps to mock it. On the other hand, the “chemical imbalance” trope has been tossed around a great deal by opponents of psychiatry, who mendaciously attribute the phrase to psychiatrists themselves. And, yes—the “chemical imbalance” image has been vigorously promoted by some pharmaceutical companies, often to the detriment of our patients’ understanding. In truth, the “chemical imbalance” notion was always a kind of urban legend – never a theory seriously propounded by well-informed psychiatrists.” 35

David Cohen is Professor of Social Welfare and Associate Dean for Research and Faculty Development at UCLA Luskin. His research looks at psychoactive drugs (prescribed, licit, and illicit) and their desirable and undesirable effects as socio-cultural phenomena “constructed” through language, policy, attitudes, and social interactions. Here he explains how psychiatry has painted itself into a corner and became unable to look for treatments for mental illness that lie outside the biomedical model, which treats all conditions of depression, anxiety and psychosis as brain disorders:

Today we have become habituated to the routine ‘medication’ of our youth, with children as young as six years old being administered tranquilisers relabelled as ‘antidepressants’ and ‘antipsychotics’, intended ‘to cure’ dysfunctions like “oppositional defiant disorder”. These considerations bring us to the broader issue of what constitutes ‘mental health’, and by extension, what it is to be ‘normal’.

Moreover, it hardly needs saying that increased diagnosis and prescription of medication of every variety is demanded by the profit motive of the pharmaceutical industry; so for now I wish merely to add that we have no demonstrable proof that the identified rise in diagnoses of mental illness is wholly attributable to a commensurate rise in mental illness itself, rather than being an artefact bound up with the medicalisation of the human condition. However, given that mental health is expressly bound up with, and to a great extent defined by, a person’s feelings of wellness, attempts to downgrade or dismiss patient testimony, or to overrule personal accounts of psychological distress by declaring some part of it illusory, are not only callous but another kind of category mistake. Whatever terminology we apply, it is evident that more people than ever are suffering forms of psychological distress. I shall consider this at greater length in the final section.


Before continuing, I would like to introduce a genuinely serendipitous finding – a newspaper clipping torn out by someone I have never met, and left inside the cover of a second-hand book for reasons I shall never know. I cannot even reference this item because I have no idea in which newspaper it was originally printed, and so will simply label it Exhibit A (the author’s name is also redacted out of courtesy):

“Someone close to me has been smoking cannabis for many years,” the author tells us, adding “That person has never worked and lives in a state of euphoria.”

From these preliminary remarks it is actually hard to tell whether the writer is issuing a caution or an endorsement of pot smoking – or at least it would be hard to tell, were it not for our informed social prejudices, since the presumed social norm is that work is always good and drugs (meaning illegal ones) unconditionally bad. Suppose, however, this surmised state of euphoria had been ascribed to quite different causes. Let’s say, for example, that the person in question was in love, or had found God, or alternatively had been prescribed a legally sanctioned medicine lifting them from a prior state of depression and anxiety, and this lasting euphoria was the outcome. Would this not be a good thing?

But the next part of the letter is perhaps the most interesting. It begins: “People on cannabis lose touch with reality. They cannot relate to normal life because they are in a perpetual state of relaxation, and doing everyday tasks or even getting up in the morning is a big deal. They drift from day to day.”

At this point, I ought to make a personal confession. The person described here is me – not me in all actuality, but another me, another drifter. It is me and a considerable number of my closest friends, who have spent a great many years smoking pot and “losing touch with reality”. Doubtless, it will describe the lives of some of the people who happen to read this too. Personally, I gave up smoking pot years ago for health reasons, and I do not advise others to follow my lead either way. Undeniably, there is some truth within the letter, but there is also a great deal of misunderstanding.

Do pot smokers live in realms of make-believe? Do we care about nothing? Interestingly, we could just as easily ask the same questions of those prescribed SSRI (selective serotonin reuptake inhibitor) antidepressants like Prozac, and all the other legally sanctioned mind-altering substances. Leaving aside social acceptance, which surely owes much to the profit motive, what other distinction can we make here, once we dismiss the false hypothesis of redressing a chemical imbalance?

Of course, none of us ever knows what might otherwise have been had we not done such and such. The road not taken is forever unknown. The only fair question therefore must involve regret, and I confess that I do not regret my decision to smoke pot, nor do I know any friends who have told me they regret their own choice in this regard. The important point I wish to emphasise is that legal determinations do not automatically establish what is better for our health and well-being, nor do they determine what is right and wrong in a moral sense. Indeed, who dares to tell another adult how they ought to think, and equally who dares to say how one may or may not alter one’s consciousness by whatever means one sees fit? If we are not entirely free to think as we choose, then as creatures so fully submerged in our thoughts, we can hardly be said to be free at all.


Here is David Cohen again, discussing how psychiatric drugs are no different in kind from many street drugs:

David Nutt, Professor of Neuropsychopharmacology at Imperial College London, has closely studied the range of harms that legal and illegal drugs can do to individuals and society. On the basis of his findings, he has reached the perhaps surprising conclusion that policy should begin with an end to the criminalisation of all drug use and possession. In March 2021 he was interviewed by Novara Media’s Ash Sarkar:

Comedian Bill Hicks also had his own opinions on why some drugs are taxed while others are banned [warning: very strong language and opinions!]:


III      Driven crazy?

“[P]eople who experience themselves as automata, as robots, as bits of machinery, or even as animals… are rightly regarded as crazy. Yet why do we not regard a theory that seeks to transmute persons into automata or animals as equally crazy?”

— R. D. Laing 36


Type the words ‘mental health crisis’ into any search engine and you will find more than a million pages linking to reports from Australia, Canada, Europe and America, all presenting stark evidence that the Western world is in the grip of what in other contexts would certainly be called a pandemic: a plague of disease that is horribly debilitating, too often fatal, and affects nearly one in ten of the population – men and women, children and the old alike. According to the latest surveys, in any given week in England 1 in 6 people report experiencing some kind of mental health problem. Between 1993 and 2014 the number of people experiencing mental health problems went up by 20%, while the proportion reporting severe mental health symptoms in any given week rose from 7% to over 9%. 37 Indeed, the issue has become so grave that it now receives serious attention in political debates. Still more positively, ways to deal with it are today widely discussed, and the stigma associated with mental illness is at last aired and challenged across the mainstream. But one question is very seldom addressed: what has generated so much suffering and distress in the first place? What is the cause of this now admitted mental health crisis?

Since the issue is obviously an extremely complex one, I propose that we break it down into three parts that can be abbreviated as three A’s: access, accountancy and aetiology. The most simplistic assumption we could make would be that our current crisis is a consequence of just one of these three factors. So, for instance, if the rise in case numbers is purely a matter of easier access to treatment, then it follows from our presumption that there is no underlying increase, but that sufferers of mental health problems are simply more able and willing to seek professional help. If true, then ‘the crisis’ has always existed but previously the greatest number simply suffered in silence.

Alternatively, we might presume that the rise is a perceived one and its origin is entirely due to changes in accountancy, in which instance states of mind that in the past were undifferentiated from the norm have gradually been medicalised as I have discussed above. Whereas improved access to care is a laudable good, by contrast, if accountancy is to blame, then society is increasingly in the business of treating the sane as if they were sick. Reclassifying normality as abnormality would mean psychiatry has helped create the illusion of an epidemic, although it is important to understand that it does not follow that the suffering itself is illusory, only that our tendency is to see that suffering as psychiatric in nature.

Alternatively again, we might instead conclude that the rise in cases is real and unrelated to either ease of access or what has been described as “the medicalisation of misery”. In this case, we are necessarily drawn into the matter of aetiology and must extend the investigation to search for underlying external causes – causes that to some degree can be found to account for a genuine rise in mental illness.

Certainly these aren’t mutually exclusive considerations, but are these three A’s exhaustive? Broadly considered, yes; however, a breakdown of this kind has indistinct, fuzzy edges, and all that is certain is that some combination, or potentially even a synergy, operates between the three. Indeed, given that mental health is expressly bound up with and unavoidably defined by feelings of wellness, no psychiatric diagnosis can ever be scientifically objective in the strictest sense. Setting aside therefore the matter of access to better healthcare, which, all else being equal, is wholly positive, my aim in the remainder of this chapter is to disentangle the other two strands.

In one sense the mental health crisis is undeniably real. More and more people are suffering forms of psychological distress and in no way do I mean to suggest otherwise. There is an urgent need therefore to get to the bottom of what is causing this crisis.

Johann Hari is a journalist who spent many years investigating the causes of depression, the reasons why the West is seeing such a rise in incidence, and how we might find better alternatives to treat this epidemic. It isn’t caused by a chemical imbalance in our brains, he notes at the outset, but crucially by changes in the way we are living:


The evidence of a connection between what happens in childhood and the effects on later behaviour is very strong indeed. This is unsurprising of course. It is perhaps self-evident that mental illness grows out of trauma and hunger, which are the bitter fruits of abuse, neglect and abandonment, both physical and psychological. But to explain the ongoing rise (affecting adults as much as children) we would be hard pressed to attribute much cause to changes in parenting styles given how the rise is so steep with a 20% increase over just two decades – very definitely not if Philip Larkin is to be believed. 38

To be frank, parents have always “fucked you up”, as for that matter have our siblings, our peers, and undoubtedly, many of our fucked-up teachers. Of course, one significant change during recent decades is that parents spend more time working, thus leaving children with childminders or, if money is tight, with the keys to an empty house. Studies again unsurprisingly show that latchkey kids are more susceptible to behavioural problems.

A related issue affecting early development is the omnipresence of new technologies. Once the pacifier was television, but this single-room distraction has been slowly superseded by the introduction of computer games, iPhones, etc. There is a widespread dependency on these types of electronic devices, and so, without any immediate control group, the psychological damage caused by habitually engaging in such virtual interactions will be extremely difficult to gauge.

Of course, television has been used as an infant pacifier ever since I can remember. No doubt it once pacified me too. But television itself has been radically transformed. It has become louder, brighter, more intense due to faster and slicker editing, and, it is surely reasonable to presume, since the sole purpose is to grab attention and transfix its audience, more and more intoxicating. Viewing TV today is a fundamentally altered experience compared to viewing it decades ago. Could any of this be having a knock-on effect with regard to attention span, cognitive skills, or, more importantly, our sense of self? This is a highly complex issue that I shall not delve into here – in the addendum I do, however, consider the psychological and societal impacts of advertising (I also dedicate a later chapter to the role advertising plays in our society).

What is known for certain is this: that other than in exceptional instances when the origin of severe mental illness can be directly traced to an underlying physical disease (syphilis is perhaps the best known example), the usual trigger for mental health problems is found to be either sudden or prolonged trauma – very often although not exclusively childhood trauma – and the development of the vast majority of mental disorders occurs therefore as a pathological but defensive response to trauma.


Following Freud, the causes of mental illness came to be thought buried deep within the patient’s unconscious. For this reason, Freud and the psychoanalysts pioneered their ‘talking cure’: conversational techniques that probed deep into the psyche. Various schools arose. They inquired into dreams, biography, sexuality, family relations or even spirituality, feeling down for the roots of their patient’s distress. With the psychical wound discovered, it might now be cleansed and disinfected by means of further introspection. Healing came about as nature then took its course. Here the patient plays a central role in their own treatment.

R. D. Laing dignified his patients in another way. Refraining from excessive presumptions built on the unsteady and evolving theories of the unconscious – the Oedipal Complex, Penis Envy, and other fabulous chimera detected by Freud and his followers – Laing gave his patients the common respect the rest of us outside the padded walls of the asylum receive from our peers. No matter how superficially crazy, he adjudged every patient’s account of his or her lived experience to be as valid in the existential sense as the truthful account of any sane human being, including his own. This exceedingly hazardous (some might say reckless) approach to a patient’s illness did, however, produce remarkable outcomes – at least to begin with – as many of those he treated recovered speedily and were declared fit enough to return home.

However, Laing’s successes seldom lasted long, and predictably, within just a few months, more than half would drift back into his care. Witnessing this cyclical pattern of decline had an interesting effect on Laing, for it caused him to reach a new and shocking conclusion. With no less conviction than before, he let it be known that social relationships, and especially ones within families, were the major triggers of his patients’ relapses. This was an audacious diagnosis which, unsurprisingly, met with general hostility, as the accused – not only the families but society as a whole – felt immediately affronted by the charge that they were the fons et origo of the patient’s sickness.

Undaunted, Laing took his ideas to their logical extreme. He allowed his patients to play out their madness to the full, believing that for a lasting cure the condition must be allowed to run its course – and who can honestly say if and when madness is fully cured? Unconstrained by the boundaries of orthodox medicine, Laing and his fellow therapists would enter perilously into the worlds of their patients. Laing himself, by all accounts, went somewhat bonkers in the process, which is hardly surprising, since whatever madness is, it is most certainly contagious (and after all, this in a nutshell is really Laing’s central point). 39

As his conduct became morally questionable – sexual affairs with his patients creating troubles within his own family – his professional reputation was understandably tarnished, and alongside this reputational decline his ideas went out of fashion. In spite of this, Laing’s legacy persists in important ways. The more dignified respect for sufferers of mental illness (who even today are sadly denied full human rights equivalence) owes a great deal to Laing’s daring intellectual courage and integrity. On the other hand, the true and lasting value of Laing’s work has been both forgotten and dismissed. For when he tells us that insanity is “a perfectly rational adjustment to an insane world” 40, then given the rise of today’s ‘mental health crisis’, our mental health professionals and society more broadly need to listen up.

In a world that’s ever slicker and faster, where human contact becomes more distant, more superficial, indeed increasingly artificial, the modern self (perhaps that should read ‘postmodern’) becomes more atomised and systematised than in Laing’s time (Laing died three decades ago). Cajoled to sacrifice ever more individuality for the sake of conformity, convenience, security and status, our given raison d’être is to engorge our material well-being, either for its own pleasure or, more egotistically, with shows of conspicuous consumption. We are, as T.S. Eliot put it so elegantly, “distracted from distraction by distraction/ filled with fancies and empty of meaning”. 41


“The normal process of life contains moments as bad as any of those which insane melancholy is filled with, moments in which radical evil gets its innings and takes its solid turn. The lunatic’s visions of horror are all drawn from the material of daily fact. Our civilization is founded on the shambles, and every individual existence goes out in a lonely spasm of helpless agony.” 42

These are the grim observations of William James, another pioneer of the field of psychology, who is here trying to get to grips with “the unpleasant task of hearing what the sick souls, as we may call them in contrast to the healthy-minded, have to say of the secrets of their prison-house, their own peculiar form of consciousness”. James’ vocabulary is remarkably direct and unambiguous, so allow me to very briefly skim the thesis of what he saw as the underlying cause of madness, sticking closely to his original terminology wherever possible.

Their “morbid-minded way”, James reluctantly concedes, should not be too readily dismissed. “With their grubbing in rat-holes instead of living in the light; with their manufacture of fears, and preoccupation with every unwholesome kind of misery…” it may appear to the “healthy-minded” as “unmanly and diseased”, but, on the other hand, “living simply in the light of good”, although “splendid as long as it will work”, involves us in a partial denial of reality which “breaks down impotently as soon as melancholy comes”. Furthermore, says James:

“… there is no doubt that healthy-mindedness is inadequate as a philosophical doctrine, because the evil facts which it refuses positively to account for are a genuine portion of reality; and they may after all be the best key to life’s significance, and possibly the only openers of our eyes to the deepest levels of truth.”

With the advent of modern comforts and our immersive condition of historically unprecedented safety and security, it can appear that those of us born in the wealthiest regions of the world have little reason to grumble, certainly when compared to the conditions of previous generations. Indeed, for anyone in Britain born into the working class or above, the famous words of Tory Prime Minister Harold Macmillan that “we’ve never had it so good” do mostly still apply. Studies have shown, of course, that social equality is far more closely correlated with overall levels of happiness than absolute levels of wealth 43, but no less apparent is the more straightforward fact that, having become materially satisfied, what we might call ‘psychological immiseration’ is more widespread than ever.

With material wants met we are left to tread a vertiginous tightrope that has been called ‘happychondria’: that perpetual and single-minded pursuit of happiness per se that makes us achingly self-aware of shortcomings in this narrow regard. And feelings of an ‘unbearable lightness of being’ become all the lighter once our striving to be happy burgeons into an all-consuming monomaniacal fixation, since happiness is insufficient to ground us and make us feel real. Worse still, as James explains, perpetual happiness is absolutely unattainable due to the inevitable travails of life, and given most people’s tangential urge to negotiate life’s experiences authentically. Or putting matters the other way around, since most people inevitably fail to attain the levels of happiness socially demanded, such non-stop pursuit of happiness (and by ‘pursuit’ here I mean ‘chasing’ rather than ‘activity’ or ‘recreation’ 44) will inevitably have adverse effects and very likely result in neurosis and feelings of moroseness. The etymological root of our word ‘happiness’ is revealing in this regard: ‘hap’ meaning luck or good fortune. Dormant in the language is a vestigial memory that happiness is a gift bestowed, rather than a treasure seized.


Unable to function for long or to endure everyday states of consciousness, a growing number of people are now turning either to legally prohibited narcotics or to prescribed and medically endorsed opiates: drugs that lift the clouds of emptiness, or else numb the user to the tawdriness of everyday reality. These pills offer a surrogate escape when it can no longer be supplied by the local shopping mall, or, always more persuasively, by TV and similar distractions online – both places where the big pharmaceutical companies go to enhance their profits by continuously pushing more of their psychoactive wares.

To a great extent, these powerful industries, whether through lobbying or via alternative means of self-promotion, have gradually reshaped psychiatry itself. The patient who was once central to their own treatment has been made peripheral once again, as the psychiatrist gets on with mending their mental apparatus. And by ‘mending’ it is better to read ‘made happier’, or else ‘made normal’, and thus subjected to a transformation which is centred on societal functioning, but that may or may not be life enhancing in a fuller and more meaningful sense. So does it finally matter if society becomes ‘happier’ and people are better able to cope due only to a widespread use of pharmaceuticals? And does it matter if children as young as six are prescribed a daily dose of mind-altering drugs just to fit in and get by? 45

What if anguish and sorrow are vital parts of an authentic experience of life, and, as a good friend and poet once put it: “woe is part of the worship”? To rebut sorrow and utterly disregard the origins of distress seems to me irredeemably Panglossian, which is surely no less life-denying than its opposite, a fatalistic surrender to misery. Because to be able truly to affirm in capitals – to say “YES” – is finally predicated on our capability to no less defiantly scream “NO!” In the end it is zombies alone that are unable ever to scream “NO!”, especially once confronted by the recurring cruelties and stupidities of this sometimes monstrous world.

Fritjof Capra says that Laing once told him, “Mystics and schizophrenics find themselves in the same ocean, but the mystics swim whereas the schizophrenics drown.” And latent within even the most zombified of people, there must linger, no matter how faintly, an inextinguishable inner presence akin to spirit, to soul; a living force that cannot be fully disabled without untold consequences. It is this inner life that fights on and kicks against the main object it can kick against: those modes of thinking and behaving that the ‘normal world’ sanctions and calls ‘sane’, but which the organism (sometimes correctly) identifies as aspects of an inexplicable, incomprehensible and literally terrifying existential threat.

This is how Laing understood the nature of madness, and Laing was one of the sanest people (by both legal and more popular definitions) ever to have stayed so close to its shadow. He studied the mad without ever flinching away, listening on with patient compassion to their plight. He stayed open and survived. In an important sense, he trusted their testimony. If we wish to understand what is happening to us, I believe we ought to trust just one of his findings too. As Laing concludes in the same preface to his book The Divided Self:

“Thus I would wish to emphasize that our ‘normal’ ‘adjusted’ state is too often the abdication of ecstasy, the betrayal of our true potentialities, that many of us are only too successful in acquiring a false self to adapt to false realities” 46

While on another occasion he wrote still more emphatically:

“From the alienated starting point of our pseudo-sanity, everything is equivocal. Our sanity is not ‘true’ sanity. Their madness is not ‘true’ madness. The madness of our patients is an artefact of the destruction wreaked on them by us and by them on themselves. Let no one suppose that we meet ‘true’ madness any more than that we are truly sane. The madness that we encounter in ‘patients’ is a gross travesty, a mockery, a grotesque caricature of what the natural healing of that estranged integration we call sanity might be. True sanity entails in one way or another the dissolution of the normal ego, that false self competently adjusted to our alienated social reality; the emergence of the ‘inner’ archetypal mediators of divine power, and through this death a rebirth, and the eventual reestablishment of a new kind of ego-functioning, the ego now being the servant of the divine, no longer its betrayer.” 47

As with death per se, we choose on the whole to remain oblivious to our all-embracing deathly materialist existence, excepting a dwindling minority whom our secular society marginalises as deluded and misguided at best, and at worst as cranks or fanatics – and there are many religious cranks and fanatics, of course, just as there are no less fanatical anti-religious zealots. Perhaps, to paraphrase Philip Larkin, the rest of us really ought to be screaming. Whether stultified or petrified, inwardly many are, and that’s where the pills come in.

Laing did not mistake madness for normality, but understood perfectly well that normality can often be madness too. And normality, in turn, after being exposed as madness, has deliberately misunderstood Laing ever since.

Next chapter…


Addendum: Advertising vs. sanity

The following brief extract is drawn from an article by satirist Hugh Iglarsh based around an interview with activist and award-winning documentary filmmaker Jean Kilbourne that was published in Counterpunch magazine in October 2020.

HI: What kind of personality does advertising cultivate? How would you describe the ideal consumer or recipient of advertising?

JK: The ideal ad watcher or reader is someone who’s anxious and feels incomplete. Addicts are great consumers because they feel empty and want to believe that something out there, something for sale, can fill them up. Perhaps the ideal consumer is someone suffering from bulimia, because this person will binge and gorge and then purge, thus needing to start the cycle all over again.

HI: Addiction is one of the major themes of your book. How does advertising help foster addiction?

JK: The selling of addictive products is of course a big part of what advertisers do. They study addiction very closely, and they know how addicts think – they literally know what lights up their brains.

Advertisers understand that it is often disconnection in childhood that primes people for addiction. For many traumatized people, the first time they drink or smoke or take drugs may be the very first time they feel okay. Soon they feel they are in a relationship with the alcohol or the cigarette. Addicts aren’t stupid – the stuff they’re taking really does work, at least at first. It numbs the pain, which makes them feel connected to the substance. Eventually the drug or substance turns on them and makes all the problems they’re fleeing from worse.

What struck me about the genius of advertisers is how they exploit themes of tremendous importance to addicts, such as their fear of loneliness and desire for freedom. This is precisely what addiction does to you – it seems to offer you what you need, while actually making you more dependent, more alone. The ads promise freedom and connection, in the form of products that entrap users and weaken relationships. 48

In Chapter Eight, The Unreal Thing, I present my own thoughts on the detrimental impact of advertising on modern culture.


Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.


1 Extract from The Divided Self: An Existential Study in Sanity and Madness by R. D. Laing, first published 1959/60; “Preface to the Pelican Edition” written September 1964.

2 From an article entitled “Asylum tourism” by Jennifer L. Bazar and Jeremy T. Burman, published in Monitor on Psychology, February 2014, Vol 45, No. 2.

3 Sometimes quoted in Latin as Quos Deus vult perdere, prius dementat (literally: Those whom God wishes to destroy, he first deprives of reason) or Quem Iuppiter vult perdere, dementat prius (literally: Those whom Jupiter wishes to destroy, he first deprives of reason) this expression has been used in English literature since at least the 17th century. In the form presented here it first appeared in the Reverend William Anderson Scott’s book Daniel, a Model for Young Men and then later in Longfellow’s poem The Masque of Pandora. Although falsely attributed to Euripides, earlier versions of this phrase do indeed have classical Greek origins.

4 The shift in attitude towards sexual practices as extreme as sadomasochism is a curious one. I take the liberal view that it is right to be fully tolerant of activities that do not injure innocent parties, and so do not wish to infringe individual freedoms when they do not violate the freedom of others. Nevertheless, I tend to regard sexual practices such as sadomasochism as perverse, and not because I do not understand them, but because I do. I recognise the urge that twists pleasure and pain together; the same one that mixes up vulnerability with humiliation. The psychological dangers are abundantly clear to me, and the fact that our society today actively promotes and normalises S/M is perhaps indicative of a traumatic breakdown in human relations. It is wonderful that society has overcome so many of its hang-ups, but not all taboos are equal. Taboos against inflicting severe pain, even when consensual, do make sense.

Sarah Byrden, a sex educator and sacred sexuality teacher, says we are simultaneously (without realising it) “being bounced off the walls between pornography and Puritanism”:

5 A quote along these lines is certainly attributed to Salvador Dalí.


6 After calling the hospital for an appointment, the pseudopatient arrived at the admissions office complaining that he had been hearing voices. Asked what the voices said, he replied that they were often unclear, but as far as he could tell they said “empty,” “hollow,” and “thud.” The voices were unfamiliar and were of the same sex as the pseudopatient. The choice of these symptoms was occasioned by their apparent similarity to existential symptoms. Such symptoms are alleged to arise from painful concerns about the perceived meaninglessness of one’s life. It is as if the hallucinating person were saying, “My life is empty and hollow.” The choice of these symptoms was also determined by the absence of a single report of existential psychoses in the literature.

Beyond alleging the symptoms and falsifying name, vocation, and employment, no further alterations of person, history, or circumstances were made. The significant events of the pseudopatient’s life history were presented as they had actually occurred. Relationships with parents and siblings, with spouse and children, with people at work and in school, consistent with the aforementioned exceptions, were described as they were or had been. Frustrations and upsets were described along with joys and satisfactions. These facts are important to remember. If anything, they strongly biased the subsequent results in favor of detecting insanity, since none of their histories or current behaviors were seriously pathological in any way.

Immediately upon admission to the psychiatric ward, the pseudopatient ceased simulating any symptoms of abnormality. In some cases, there was a brief period of mild nervousness and anxiety, since none of the pseudopatients really believed that they would be admitted so easily. Indeed, their shared fear was that they would be immediately exposed as frauds and greatly embarrassed. Moreover, many of them had never visited a psychiatric ward; even those who had, nevertheless had some genuine fears about what might happen to them. Their nervousness, then, was quite appropriate to the novelty of the hospital setting, and it abated rapidly.

Apart from that short-lived nervousness, the pseudopatient behaved on the ward as he “normally” behaved. The pseudopatient spoke to patients and staff as he might ordinarily. Because there is uncommonly little to do on a psychiatric ward, he attempted to engage others in conversation. When asked by staff how he was feeling, he indicated that he was fine, that he no longer experienced symptoms. He responded to instructions from attendants, to calls for medication (which was not swallowed), and to dining-hall instructions. Beyond such activities as were available to him on the admissions ward, he spent his time writing down his observations about the ward, its patients, and the staff. Initially these notes were written “secretly,” but as it soon became clear that no one much cared, they were subsequently written on standard tablets of paper in such public places as the dayroom. No secret was made of these activities.

The pseudopatient, very much as a true psychiatric patient, entered a hospital with no foreknowledge of when he would be discharged. Each was told that he would have to get out by his own devices, essentially by convincing the staff that he was sane. The psychological stresses associated with hospitalization were considerable, and all but one of the pseudopatients desired to be discharged almost immediately after being admitted. They were, therefore, motivated not only to behave sanely, but to be paragons of cooperation. That their behavior was in no way disruptive is confirmed by nursing reports, which have been obtained on most of the patients. These reports uniformly indicate that the patients were “friendly,” “cooperative,” and “exhibited no abnormal indications.”

Extract taken from Rosenhan DL (January 1973) entitled “On being sane in insane places” published in Science 179 (4070): 250–8.

7    Ibid.

8    Ibid.


9 A psychiatric label has a life and an influence of its own. Once the impression has been formed that the patient is schizophrenic, the expectation is that he will continue to be schizophrenic. When a sufficient amount of time has passed, during which the patient has done nothing bizarre, he is considered to be in remission and available for discharge. But the label endures beyond discharge, with the unconfirmed expectation that he will behave as a schizophrenic again. Such labels, conferred by mental health professionals, are as influential on the patient as they are on his relatives and friends, and it should not surprise anyone that the diagnosis acts on all of them as a self-fulfilling prophecy. Eventually, the patient himself accepts the diagnosis, with all of its surplus meanings and expectations, and behaves accordingly. [Ibid.]

10  Ibid.

11 Physicists – at least all the ones I’ve known – whether they’ve heard it before or not (and they generally have heard it before), get the joke immediately; non-physicists, on the other hand, I refer to the old saw that “many a true word is spoken in jest.” For such blunt reductionism certainly does lie at the heart of physics, and indeed of all ‘hard science’; disciplines that are founded upon the simplification of the infinitely complex processes of the natural world. With its especial penchant for ‘elegance’ and parsimoniousness, every physicist is trained through repeated worked examples, and eventually hard-wired to consider the most straightforward and ideal case as the most productive first step in solving every problem: hence the spherical cow. The funny thing is how often it works!

Consider a Spherical Cow became the title of a book about methods of problem solving using simplified models, written by environmental scientist John Harte and published in 1988.

In a letter to Science journal published in 1973 the author Steven D. Stellman instead postulated “A Spherical Chicken”.

12 The fact that no-one is actually able to answer this question says a lot about time machines – but that’s for a separate discussion!

13 From the essay Night Walks written by Charles Dickens, originally published in the weekly journal All the Year Round in 1860, and appearing as Chapter 13 of The Uncommercial Traveller (1861).

14 From Aldous Huxley’s Brave New World Revisited (1958), chapter 8 “Chemical Persuasion”

15 From Oliver Sacks’ A Leg to Stand On (1984), chapter VII, “Understanding”

16 From an interview in The Observer published January 25, 1931.

17 In 1951, Solomon Asch conducted his first conformity laboratory experiments, inviting groups of male college students to participate in a simple “perceptual” task, which involved deciding which of three lines labelled A, B and C matched the length of a comparator line on a different card. In reality, all but one of the participants were actors, and the true focus of the study was how the remaining participant would react to the actors’ behaviour. Each participant was asked in turn to say aloud which line matched the length of that on the first card, and the seating was arranged so that the real participant always responded last.

In the control group, with no pressure to conform to actors, the error rate on the critical stimuli was less than 1%. In the actor condition too, the majority of participants’ responses remained correct (63.2%), but a sizable minority of responses conformed to the actors’ incorrect answer (36.8%). The responses revealed strong individual differences: 5% of participants were always swayed by the crowd, only 25% consistently defied majority opinion, and the rest conformed on some trials. Overall, 75% of participants gave at least one incorrect answer across the 12 critical trials. Reflecting on these results, Asch put it this way: “That intelligent, well-meaning, young people are willing to call white black is a matter of concern.”

18 This is sometimes called ‘Planck’s Principle’ and it is taken from the following passages drawn from Wissenschaftliche Selbstbiographie. Mit einem Bildnis und der von Max von Laue gehaltenen Traueransprache. [trans: Scientific Autobiography. With preface and obituary by Max von Laue] Johann Ambrosius Barth Verlag (Leipzig 1948), p. 22, in Scientific Autobiography and Other Papers, (1949), as translated by F. Gaynor, pp. 33–34, 97.

“A new scientific truth does not generally triumph by persuading its opponents and getting them to admit their errors, but rather by its opponents gradually dying out and giving way to a new generation that is raised on it. … An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out, and that the growing generation is familiarized with the ideas from the beginning: another instance of the fact that the future lies with the youth.”

19 William James, Principles of Psychology volume I. chapter vii. p. 196, 1890.

20 Transforming Diagnosis, a post by former National Institute of Mental Health (NIMH) Director Thomas Insel, published by NIMH on April 29, 2013.

21 The film (released 1975) was the adaptation of a novel of the same name written by Ken Kesey and published more than a decade earlier, in 1962. Kesey based his story on experiences he had had working late shifts as an orderly at a mental health institution, as well as more personal experiences of using psychedelics.

22 Quote taken from Cracked: Why Psychiatry is Doing More Harm Than Good (2012) by James Davies, Chapter 2, “The DSM – a great work of fiction?”

23 Ibid.

24 Ibid.

25 Ibid.

26 Ibid.

27 Ibid.

28 Ibid.

29 From an article entitled “Diagnosing the D.S.M.” written by Allen Frances, published in The New York Times on May 11, 2012.

30 From an article entitled “Inside The Battle To Define Mental Illness” written by Gary Greenberg, published in Wired magazine on December 27, 2010.

31 The practice continued in France into the 1980s, whereas, perhaps surprisingly, it had already been banned on moral grounds by 1950 in the Soviet Union.


32 The Montreal Experiments, carried out on patients suffering from schizophrenia, used sensory deprivation, ECT and drugs (including drug-induced coma) combined with “psychic driving”, an early form of brainwashing involving pre-recorded audio tapes played non-stop for days, with up to half a million repetitions altogether. One of Cameron’s victims was Jean Steel, whose daughter Alison (only four and a half at the time of her mother’s treatment) told CBC News in an interview:

“She was never able to really function as a healthy human being because of what they did to her.”

From an article entitled “Federal government quietly compensates daughter of brainwashing experiments victim” written by Elizabeth Thompson, published by CBC News on October 26, 2017.

Embedded below is an episode from CBC investigative documentary series The Fifth Estate entitled “MK Ultra: CIA mind control program in Canada” that was first broadcast in 1980:

33 An article titled “Rorschach and Awe” written by Katherine Eban, published in Vanity Fair in July 2007 reported that:

A psychologist named Jean Maria Arrigo came to see me with a disturbing claim about the American Psychological Association, her profession’s 148,000-member trade group. Arrigo had sat on a specially convened A.P.A. task force that, in July 2005, had ruled that psychologists could assist in military interrogations, despite angry objections from many in the profession. […]

Two psychologists in particular played a central role: James Elmer Mitchell, who was attached to the C.I.A. team that eventually arrived in Thailand, and his colleague Bruce Jessen. Neither served on the task force or are A.P.A. members. Both worked in a classified military training program known as SERE—for Survival, Evasion, Resistance, Escape—which trains soldiers to endure captivity in enemy hands. Mitchell and Jessen reverse-engineered the tactics inflicted on SERE trainees for use on detainees in the global war on terror, according to psychologists and others with direct knowledge of their activities. The C.I.A. put them in charge of training interrogators in the brutal techniques, including “waterboarding,” at its network of “black sites.” In a statement, Mitchell and Jessen said, “We are proud of the work we have done for our country.”

An article titled “The Black Sites” written by Jane Mayer, published in The New Yorker in August 2007 picked up the same story:

The use of psychologists [on the SERE program] was also considered a way for C.I.A. officials to skirt measures such as the Convention Against Torture. The former adviser to the intelligence community said, “Clearly, some senior people felt they needed a theory to justify what they were doing. You can’t just say, ‘We want to do what Egypt’s doing.’ When the lawyers asked what their basis was, they could say, ‘We have Ph.D.s who have these theories.’ ” He said that, inside the C.I.A., where a number of scientists work, there was strong internal opposition to the new techniques. “Behavioral scientists said, ‘Don’t even think about this!’ They thought officers could be prosecuted.”

Nevertheless, the SERE experts’ theories were apparently put into practice with Zubaydah’s interrogation. Zubaydah told the Red Cross that he was not only waterboarded, as has been previously reported; he was also kept for a prolonged period in a cage, known as a “dog box,” which was so small that he could not stand. According to an eyewitness, one psychologist advising on the treatment of Zubaydah, James Mitchell, argued that he needed to be reduced to a state of “learned helplessness.” (Mitchell disputes this characterization.)

A subsequent Senate Intelligence Committee report from 2014 confirms that:

The CIA used two outside contract psychologists to develop, operate, and assess its interrogation operations. The psychologists’ prior experience was at the Air Force Survival, Evasion, Resistance and Escape (SERE) school. […]

The contractors developed the list of enhanced interrogation techniques and personally conducted interrogations of some of the CIA’s most significant detainees using those techniques. The contractors also evaluated whether detainees’ psychological state allowed for the continued use of the techniques, even for some detainees they themselves were interrogating or had interrogated. […]

In 2005, the psychologists formed a company to expand their work with the CIA. Shortly thereafter, the CIA outsourced virtually all aspects of the program. The CIA paid the company more than $80 million.


34 “The discovery of phenothiazines, the first family of antipsychotic agents has its origin in the development of the German dye industry, at the end of the 19th century (Graebe, Liebermann, Bernthsen). Up to 1940 they were employed as antiseptics, antihelminthics and antimalarials (Ehrlich, Schulemann, Gilman). Finally, in the context of research on antihistaminic substances in France after World War II (Bovet, Halpern, Ducrot) the chlorpromazine was synthesized at Rhône-Poulenc Laboratories (Charpentier, Courvoisier, Koetschet) in December 1950. Its introduction in anaesthesiology, in the antishock area (lytic cocktails) and “artificial hibernation” techniques, is reviewed (Laborit), and its further psychiatric clinical introduction in 1952.”

From the abstract to a paper entitled “History of the Discovery and Clinical Introduction of Chlorpromazine” authored by Francisco Lopez-Muñoz, Cecilio Alamo, Eduardo Cuenca, Winston W. Shen, Patrick Clervoy and Gabriel Rubio, published in the Annals of Clinical Psychiatry, 17(3):113–135, 2005.

35 Psychiatry’s New Brain-Mind and the Legend of the “Chemical Imbalance” written by Ronald W. Pies, Editor-in-Chief Emeritus, and published by Psychiatric Times on July 11, 2011.

36 Extract from The Divided Self: An Existential Study in Sanity and Madness by R. D. Laing, first published 1959/60; Part 1, Chapter 1, “The Existential-Phenomenological Foundations for A Science of Persons”.

37 McManus S, Bebbington P, Jenkins R, Brugha T. (eds.) (2016). Mental health and wellbeing in England: Adult psychiatric morbidity survey 2014.

38 Larkin’s celebrated poem This Be the Verse, which begins with the lines “They fuck you up, your mum and dad / They may not mean to, but they do”, was written and first published in 1971.

39 One of Laing’s great interests was in the “double bind” situation, which he came to diagnose as the root cause of most of the madness around him. Laing had adopted the idea of the “double bind” from anthropologist Gregory Bateson. Bateson, in turn, had traced the notion back to a semi-autobiographical novel by the Victorian novelist Samuel Butler, entitled The Way of All Flesh. But Butler had only described the condition and not named it, whereas Bateson had rediscovered it and labelled it as an important cause of schizophrenia.

Hearing from a parent, for instance, that “I love you” whilst seeing no expression which supported the evidence of that expressed love, presented the patient with a “double-bind” situation. This is just one example, but Laing had witnessed this and many other kinds of “paradoxical communication” in his patients’ relationships with their nearest and dearest. He eventually came to believe, along with Bateson, that being caught in such a “double-bind” situation was existentially damaging and very commonly, therefore, psychologically crippling. In recognising this, Laing had undoubtedly discovered a fragment of the truth, and it is a shame that he then over-intellectualised the issue, as intellectuals are wont to do. Replace “double bind” with “mind game” and his case becomes much clearer. If people, especially those you are closest to and those you need to trust, constantly undermine your view of yourself and of your relationship to others, then the seeds of destruction are being sown. But to my mind, such details of Laing’s outlook are nothing like as interesting and illuminating as the general thrust of what he had to say about our society.

40 As quoted in Wisdom for the Soul: Five Millennia of Prescriptions for Spiritual Healing (2006) by Larry Chang, p. 412; this might be a paraphrase, as the earliest occurrence of this phrase thus far located is in the form: “Ronald David Laing has shocked many people when he suggested in 1972 that insanity can be a perfectly rational adjustment to an insane world.” in Studii de literatură română și comparată (1984), by The Faculty of Philology-History at Universitatea din Timișoara. A clear citation to Laing’s own work has not yet been found.

41 From the first of T.S. Eliot’s Four Quartets titled Burnt Norton.

42 This passage continues:

“If you protest, my friend, wait till you arrive there yourself! To believe in the carnivorous reptiles of geologic times is hard for our imagination—they seem too much like mere museum specimens. Yet there is no tooth in any one of those museum-skulls that did not daily through long years of the foretime hold fast to the body struggling in despair of some fated living victim. Forms of horror just as dreadful to their victims, if on a smaller spatial scale, fill the world about us to-day. Here on our very hearths and in our gardens the infernal cat plays with the panting mouse, or holds the hot bird fluttering in her jaws. Crocodiles and rattlesnakes and pythons are at this moment vessels of life as real as we are; their loathsome existence fills every minute of every day that drags its length along; and whenever they or other wild beasts clutch their living prey, the deadly horror which an agitated melancholiac feels is the literally right reaction on the situation.”

Extract taken from The Varieties of Religious Experience: A Study in Human Nature, Lectures VI and VII, “The Sick Soul”, William James (1902)

43 In their 2009 book The Spirit Level: Why More Equal Societies Almost Always Do Better authors Richard G. Wilkinson and Kate Pickett examined the major impact that inequality has on eleven different health and social problems: physical health, mental health, drug abuse, education, imprisonment, obesity, social mobility, trust and community life, violence, teenage pregnancies, and child well-being. The related Equality Trust website that was co-founded by the authors also includes scatter plots from their book. One of these shows a remarkably close correlation between prevalence of mental illness and income inequality with the following explanatory notes attached:

“Until recently it was hard to compare levels of mental illness between different countries because nobody had collected strictly comparable data, but recently the World Health Organisation has established world mental health surveys that are starting to provide data. They show that different societies have very different levels of mental illness. In some countries only 5 or 10% of the adult population has suffered from any mental illness in the past year, but in the USA more than 25% have.

“We first showed a relationship between mental illness and income inequality in eight developed countries with WHO data – the USA, France, Netherlands, Belgium, Spain, Germany, Italy, and Japan. Since then we’ve been able to add data for New Zealand and for some other countries whose surveys of mental illness, although not strictly comparable, use very similar methods – Australia, the UK and Canada. As the graph [above] shows, mental illness is much more common in more unequal countries. Among these countries, mental illness is also more common in the richer ones.”

More Information

Pickett KE, James OW, Wilkinson RG. Income inequality and the prevalence of mental illness: a preliminary international analysis. Journal of Epidemiology and Community Health 2006;60(7):646-7.

Wilkinson RG, Pickett KE. The problems of relative deprivation: why some societies do better than others. Social Science and Medicine 2007; 65: 1965-78.

James O. Affluenza, London: Vermilion, 2007.

Friedli L. Mental health, resilience and inequalities: how individuals and communities are affected, World Health Organisation. 2009.

Wilkinson RG, Pickett KE. The Spirit Level. Penguin. 2009.


44 A distinction I owe to American Archetypal Psychologist and former Director of Studies at the C.G. Jung Institute in Zurich, James Hillman.

45 The facts speak for themselves really. For instance, a 2011 report from the Centers for Disease Control and Prevention (CDC) reveals that in just ten years antidepressant use in the US increased by a staggering 400%.

The report reveals that more than one in ten of the American population aged 12 or over is taking antidepressants. But that’s okay, according to “the authors of the report” because: “… many people who could benefit from antidepressants aren’t taking them. Only a third of people with symptoms of severe depression take antidepressants.”

The same report also reveals how a further 8% of Americans without depressive symptoms take the drugs for other reasons such as anxiety. And what about the population below 12 years old? Well, the following is taken from a report on what’s happening closer to home, published by the Guardian in March 2011 and which begins:

“Children as young as four are being given Ritalin-style medication for behavioural problems in breach of NHS guidelines.”

According to official UK guidelines, children over the age of six can now be prescribed mind-altering substances, even when these are to be administered on a daily basis.

46 Extract from The Divided Self: An Existential Study in Sanity and Madness by R. D. Laing, first published 1959/60; “Preface to the Pelican Edition” written September 1964. Laing adds: “But let it stand. This was the work of a young man. If I am older, I am now also younger.”

47 R. D. Laing, The Politics of Experience  (Ballantine Books, N.Y., 1967)

48 From an article entitled “Advertising vs. Democracy: An Interview with Jean Kilbourne” written by Hugh Iglarsh, published in Counterpunch magazine on October 23, 2020.


Filed under « finishing the rat race », Uncategorized

roll up the red carpet!

The following article is Chapter Five of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year and beyond. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from the category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.


“All animals are equal
but some animals are more equal than others”

— George Orwell 1


I discovered recently and by happy accident that the author, Michael Young, who invented the term ‘meritocracy’, detested his own creation. Here’s how Young outlined his position in a Guardian article “Down with meritocracy”, published in 2001:

I have been sadly disappointed by my 1958 book, The Rise of the Meritocracy. I coined a word which has gone into general circulation, especially in the United States, and most recently found a prominent place in the speeches of Mr Blair.

The book was a satire meant to be a warning (which needless to say has not been heeded) against what might happen to Britain between 1958 and the imagined final revolt against the meritocracy in 2033.2

But I shall save further thoughts on Michael Young until later, and begin here by considering what lies in the shadows of a meritocracy. After all, and at first glance, what on earth can be wrong with the purposeful restructuring of society in ways that prioritise ‘merit’ above all else? Isn’t this the epitome of a fair system?

As with examining most ideas, it is helpful first to step back a little to gain perspective. In this case, it is important to get a fuller grasp of what ‘merit’ means when buried within the heart of ‘meritocracy’. What does ‘merit’, in this narrow political sense, finally equate to?

Throughout the last two hundred and more years, including under progressive administrations such as Clement Attlee’s reforming government in Britain and FDR’s earlier New Deal for America, the political systems in the West have remained very solidly rooted in capitalism, and being so, they have remained inherently utilitarian in design. It follows that ‘merit’ (in our narrow definitional sense) must be gauged on the scales of those extant utilitarian-capitalist conventions: that ‘merit’ therefore becomes an adjunct of ‘utility’ or, in other words, ‘usefulness’.

Advocates of capitalism like to evoke the invisible hand of the market, which they say enhances productivity and safeguards against wanton overproduction, thereby ensuring society’s needs are met. Thanks to the market, that which is wasteful falls away, and in consequence profits and earnings flow to the most efficient producers. So it follows that within a meritocracy governed strictly by market forces, with the invisible hand steering our efforts unerringly toward ‘usefulness’, estimations of ‘merit’ ought to be fairly directly measurable in terms of salaries and wealth. Maximum profits and earnings tend to go to those who serve the most useful function and are, by dint of this, the most ‘merited’. The losers are those who merit little since they provide little to nothing of use, and, conversely, the winners contribute most gainfully in every sense…

There is already a suffocating tightness in this loop; a circularity that brings me to consider the first serious objection against meritocracy, if only the most trivial and conspicuous. For judged solely by its own terms just how meritocratic is our celebrated meritocracy? Hmmm – need I go on? Very well then, I shall offer this brisk reductio ad absurdum:

Let’s start where the debate ordinarily ends, with the topic of professional footballers… To most people, the excessive salaries paid to footballers stand out as an egregious example of unfairness. I share the same view, but wonder why we stop at footballers. They are not alone; not by a long chalk.

Indeed, given that our utilitarian-capitalist meritocracy does in fact function as it is presumed to function, then it follows that most top sportsmen (to a lesser extent, sportswomen too), including footballers, but also tennis players, golfers, F1 drivers, cyclists, athletes, etc – sports of low popularity by comparison – as well as pop idols, TV celebrities and film stars (not forgetting agents and the retinue of hangers-on) are, by virtue of their fabulous incomes, not merely most deserving of such high rewards, but also, by direct extension, some of the most ‘productive’ amongst us. Would anyone care to defend this high-visibility flaw in our socio-economic system? Truth is that many on this ever-expanding list are rewarded for just one thing: fame – thanks to another self-perpetuating cycle in which fame makes you wealthy, and then wealth makes you more famous again.

Nor does such rightful utilitarian calculus reliably account for the gargantuan salaries and bonuses (and who else gets bonuses in excess of their salaries?) of so many bankers, hedge fund managers and other financiers who callously wrecked our western economies. With annual remuneration that outstrips most ordinary workers’ lifetime earnings, the staggering rewards heaped upon those working in The City and Wall Street bear little relationship to levels of productivity and usefulness; worse, remuneration is evidently disconnected from levels of basic competence. Instead we find that greedy ineptitude is routinely and richly rewarded, if only for the ‘made men’ already at the top and lucky enough to be “too big to fail”. In light of the crash of 2008, any further talk of “the classless society” ought to have us all running for the exits!

Then we come to the other end of our meritocratic muck-heap. And here amongst the human debris we find contradictions of an arguably more absurd kind. I am referring to those disgustingly unworthy winners of our many lotteries – you know the types: petty criminals, knuckle-draggers and wastrels (the tone here is strictly in keeping with the tabloid outrage on which it is based) who blow all their winnings on a binge of brash consumerism and a garage full of intoxicants. Conspicuous consumption of the most vulgar kinds! How dare they squander such hard, unearned dosh on having fun! But wait a minute… surely the whole point of running a lottery is that anyone can win. Have we forgotten the advertisement already? So if we are really serious about our meritocracy then perhaps we should be stricter: no lotteries at all! Yet a cursory consideration of this point presents us with far bigger hurdles. For if we are truly committed to the project of constructing a meritocracy (and we must decide precisely what this means), it is vital to acknowledge that life is inherently beset with lotteries. Indeed, when roundly considered, this represents an existential dilemma that potentially undermines the entire project.

For life begins with what might best be described as our lottery of inheritance. Where you are born and to whom, the postal code you reside in, the schools you attended, your religious (or not) upbringing, whether you happen to carry one or two X-chromosomes, and the colour of your skin… the whole nine yards. Your entire existence happened by extraordinary chance and each and every aspect of it owes an unfathomable debt to further blind chance.

Therefore, in our most puritanical understanding of meritocracy, lotteries relating to the guessing of random numbers will be abolished altogether, in order to set a precedent, although still these other lotteries, life’s lotteries, remain inescapable. This is a devastating blow to the very concept of fully-fledged meritocracy, since whatever meritocracy we might choose to build will always remain a compromise of one kind or another.

In point of fact, however, we have been moving instead in the completely opposite direction. There has been a tremendous and rapid growth in lotteries of all shapes and sizes: from the casino economy working to the advantage of financial speculators at the top; to the rise of online casinos and the latest betting apps, mathematically honed to suck money from the pockets of the desperate and sometimes destitute pipedreamers at the bottom. Further indications of how far our society truly diverges from even the most rudimentary notions of meritocracy.

So there is plenty of scope for devising a better version of meritocracy; one that isn’t so riddled with blatant inconsistencies and arbitrary rewards. A more refined meritocracy operating according to common sense fairness and consistency, with built-in checks and balances to ensure the winners are more consistently worthy than the losers. A more level playing field bringing us closer to the ideal – for surely a better devised version of meritocracy is the fairest system we can ever hope to live under. In fact, I beg to differ, but before entering further objections to the sham ideal of meritocracy, I wish first to celebrate the different areas in which greater equality has indeed been achieved, and to note those where it is still dangerously lacking.

During the Q&A session following a lecture entitled “Capitalist Democracy and its Prospects” that he delivered in Boston on September 30th, 2014, Noam Chomsky speaks to why the notion of a capitalist democracy is oxymoronic. He also discusses the widespread misinterpretation of Adam Smith’s economic thinking, especially amongst libertarians, and specifically regarding the misuse of his terms ‘invisible hand’ and ‘division of labour’.


There is no denying that at the start of the twenty-first century our own society has, and in a number of related ways, been made fairer and more equal than it was just thirty years ago when I was a school-leaver. Most apparent is the sweeping change in attitudes towards race and gender. Casual racism wasn’t merely permissible in seventies and early eighties Britain, but an everyday part of the mainstream culture. The sporadic Black or Asian characters on TV were neatly slotted into their long-established stereotypes, and comedians like bilious standup Bernard Manning had free rein to defile the airwaves with their popular brands of inflammatory bigotry. Huge strides have been taken since then, and social attitudes are unalterably changed for the better. Today the issue of diversity is central to political debate, and social exclusion on the grounds of race and gender is outlawed.

In the prophetic words of abolitionist preacher Theodore Parker, “the arc of the moral universe is long but it bends toward justice”; words famously borrowed by Martin Luther King in a celebrated sermon he delivered in 1965.3 It was a momentous year: one that marked the official end to racial segregation in the Southern United States with the repeal of the horrendous Jim Crow laws, and the same year when Harold Wilson’s Labour government passed the Race Relations Act prohibiting discrimination in Britain on “grounds of colour, race, or ethnic and national origins”.

On August 28th (last Tuesday) ‘Democracy Now’ interviewed co-founder and chair of the Black Panther Party, Bobby Seale, who was arrested and indicted after speaking outside the 1968 Democratic National Convention in Chicago. He describes how during his trial Judge Julius Hoffman ordered him to be gagged and bound to his chair [from 9:15 mins]:

Did Bobby Seale’s treatment provide inspiration for Woody Allen’s madcap courtroom scene in ‘Bananas’? [from 5:00 mins]:


As Parker and King understood well, of course, the arc of the moral universe does not bend of its own accord but requires tremendous pressure from below. And so it was, again in 1965, that Wilson’s government sent shockwaves through the former colony of Rhodesia, where, in an effort to stave off the end of its apartheid system, the white minority government under then-Prime Minister Ian Smith declared independence, and an armed struggle for black liberation ensued. It was a bloody struggle that would grind on throughout the 70s, but one that ended in triumph. Meanwhile, apartheid in neighbouring South Africa outlasted Rhodesia for a further decade and a half before it too was dismantled in 1994 and the rainbow flag could be hoisted.

In solidarity with Nelson Mandela and leading the armed struggle had been Joe Slovo, a commander of the ANC’s military wing Umkhonto we Sizwe (MK), who fought alongside deputy Ronnie Kasrils; both were the sons of émigré Jews. Also prominent within the anti-apartheid resistance were other Jewish figures including Denis Goldberg, Albie Sachs, and Ruth First – an activist, scholar and the wife of Joe Slovo, who was murdered by a parcel bomb sent to her in Mozambique. Ironically, today Israel stands alone as the last remaining state that legally enforces racial segregation, but even the concrete walls and barbed wire dividing the West Bank and Gaza cannot hold forever.

This video footage was uploaded as recently as Wednesday 29th. It shows a young Palestinian girl living under Israeli control in Hebron having to climb a closed security gate just to get home:

The fence had been extended in 2012 and fitted with a single gate to provide entrance to the Gheith and a-Salaimeh neighborhoods in Hebron. The footage below was recorded by B’Tselem in May 2018 and shows other students unable to return from school and their mothers beseeching the Border Police officers to open it. The officers say in response that the gate is closed as “punishment” for stone throwing; a collective punishment that is prohibited under international law:


Likewise, homosexuality, which until astonishingly recent times remained a virtually unspoken taboo, was decriminalised only in 1967 – the year of my birth and coincidentally the same year aboriginal Australians received full citizenship and the right to vote.

Before the Sexual Offences Act came into force, gay men faced prosecution and a prison sentence (lesbians slipped through the legal loophole due to technicalities surrounding the delicate issue of penetration), whereas today they enjoy the equal right to marriage. Cynics will doubtless say this entitles them to an alternative form of imprisonment, but hurrah for that… since irrespective of one’s views on the institution of marriage, equality under law is indicative of genuine social progress. The same goes for the transformation of attitudes and the legal framework in countering discrimination on grounds of gender, disability and age. Discrimination based on all these prejudices is plain wrong, and liberation on all fronts, an unimpeachable good.

In these ways, our own society – like others across the globe – has become more inclusive, and, if we choose to describe it as such, more meritocratic. Yet many are still left out in the cold. Which people? Sadly, but in truth, all of the old prejudices linger on – maybe they always will – but prime amongst them is the malignant spectre of racism.

For overall, as we have become more conscious of racism and less accepting of it than in the past, the racists, in consequence, have adapted to fit back in. More furtive than old-style racism, which wore its spiteful intolerance so brashly on its sleeve, many in the fresh crop of bigots have learned to feign better manners. The foaming rhetoric of racial supremacy is greatly moderated, and more care is taken to legitimise the targeting of the chosen pariahs. Where it used to be said that “the Coloureds” and “the Pakis” (and other labels very much more obscene again) were innately ‘stupid’, ‘lazy’, ‘doped-up’ and ‘dirty’ (the traditional rationalisations for racial hatred), the stated concern today is in difference per se. As former BNP leader Nick Griffin once put it:

[I]nstead of talking about racial purity, you talk about identity, and about the needs and the rights and the duty to preserve and enhance the identity of our own people.4

And note how identity politics here plays to the right wing just as it does to the left – better, in fact, because it is a form of essentialism. In effect, Griffin is saying ‘white lives matter’, when of course what he really means is ‘white lives are superior’. But talk of race is mostly old hat to the new racists in any case, who prefer to attack ‘culture’ over ‘colour’.

In multicultural Britain, it is the Muslim minority, and especially Muslim women, who bear the brunt of the racial taunts and physical abuse, and who have become the most preyed upon as victims of hate crimes, while the current hypocrisy lays the blame at their door for failing to adopt western values and mix in; a scapegoating that alarmingly recalls the Nazi denigration and demonisation of the Jews. It follows, of course, that it is not the racists who are intolerant but the oppressed minority: those who are, or who merely look like, Muslims. By this sleight of hand, Islamophobia (a very clumsy word for a vile creed) festers as the last semi-respectable manifestation of racism.

When it was released in 1974, “Blazing Saddles” shocked audiences. It is no less shocking today; the difference is that today no one could make it. No contemporary film in which every third word is a vile racist expletive would pass the censors. Yet as it plunges us headlong into a frenetic whirlwind of bigotry, and as all commonsense rationality is suspended, nothing remains besides the hilarious absurdity of racial prejudice. Dumb, crude and daring: it is comedy of rare and under-appreciated genius. As Gene Wilder put it, “They’ve smashed racism in the face and the nose is bleeding, but they’re doing it while you laugh” [6:15 mins]. Embedded below is a behind-the-scenes documentary tribute entitled “Back in the Saddle” [Viewer discretion advised]:


“It is only shallow people who do not judge by appearances,” quipped Oscar Wilde.5 And though the accusation at the heart of his bon mot may be contested, that most people certainly do judge by appearances really cannot be. Briefly then, I wish to consider a few of the most overlooked but widespread social prejudices, which, though seldom so vicious as, and of less clear historical significance than, virulent strains like sexism and racism, are long-standing and ingrained prejudices nonetheless. These tend to be prejudices against certain types of individual, rather than against interconnected “communities” – prejudices so commonplace that some readers will doubtless see my digression as trivial, or even laughable. Yet there is good reason to delve into the matter, as it opens up a bigger question and, once expanded upon, more fundamentally challenges our whole notion of meritocracy. So here goes… (I am braced for the many titters and guffaws and encourage you to laugh along!)

Firstly, there is a permitted prejudice on the one hand against short blokes (trust me, I am one), and on the other against fat ladies. Short men and fat women are considered fair game for ridicule literally on the grounds that we don’t shape up. Which would be fine – believe me, I can take a joke – except that in playing down the deep-seated nature of such prejudice, as society generally does, all sorts of insidious consequences follow. It means, to offer a hopefully persuasive example, that whenever satirists (and I use the term loosely, since genuine satire is rather thin on the ground) lampoon Nicolas Sarkozy, rather than holding him to account for his reactionary politics and unsavoury character, they go for the cheaper shot of quite literally belittling him (and yes, prejudice in favour of tallness saturates our language too). Worse still, Sarkozy had the gall to marry a taller and rather glamorous woman, which apparently makes him a still better target for wisecracks about being a short-arse (it’s okay, I’m reclaiming the word). As a result, Sarkozy is most consistently disparaged only for what he couldn’t and needn’t have altered, instead of what he could and should have. No doubt he takes it all on the chin… presuming anyone can actually reach down that far! Yes, it’s perfectly fine to laugh, just so long as we don’t all continue pretending that there is no actual prejudice operating.

Moreover, it is healthy for us to at least admit that there is a broader prejudice operating against all people regarded in one way or another as physically less attractive. Being fat, short, bald or just plain ugly are – in the strictest sense – all handicaps, which, though far from insurmountable, represent a hindrance to achieving success. Even the ginger-haired enjoy a less than even break, as Neil Kinnock (who was unfortunate enough to be a Welshman too) discovered shortly after he was elected leader of the Labour Party.

Indeed, most of us will have been pigeon-holed one way or another, and though we may sincerely believe that we don’t qualify to be categorised too negatively, our enemies will assuredly degrade us for reasons beyond our ken. But then, could we ever conceive of the rise of, let’s say, an “ugly pride” movement? Obviously it would comprise solely those self-aware and unblinkingly honest enough to see themselves as others actually see them. This envisaged pressure group would be an exceptionally brave and uncommon lot.

Then what of the arguably more delicate issues surrounding social class? Indeed, we might reasonably ask ourselves why there is such an animal as social class in the first place. And the quick answer is that people are inherently hierarchical. “I look up to him because he is upper class, but I look down on him because he is lower class”, to quote again the famous skit from The Frost Report. But now pay proper attention to the vocabulary and its direct correspondence with the physical stature of the three comedians.6


Class and stature side-by-side, just as they are in the dictionary – and as they have been throughout recent history, thanks to dietary deficiencies. Here is a visual gag with etymological parallels: the word ‘stature’ is itself a double entendre. But, unlike physical stature, class is already inextricably tied into levels of wealth and success, and virtually impossible to escape in any society – the Soviet system and Mao’s China were arguably more deeply class-riven than our own purportedly “classless” societies.

Incidentally, I in no way advocate the drafting of future legislation to close the gap on these alternative forms of everyday discrimination: demanding social justice for all those with unpopular body shapes, or who speak with the wrong accent, or stutter, or who have chosen to grow patches of hair in the wrong places, or whatever it is (beards became fashionable after I wrote this!). That would instantly make our lives intolerable in another way: it would be (as the Daily Mail loves to point out) “political correctness gone mad!” After all, prejudice and discrimination come in infinite guises, so where could we finally draw the line?

All of which brings me to our last great tolerated prejudice, and one that is seldom if ever acknowledged as a prejudice in the first place. It is our own society’s – and every other society’s for that matter – very freely held discrimination on the grounds of stupidity. And no, this is not meant as a joke. But that it sounds like a joke makes any serious discussion about it inherently tricky.

Because the dim (and I have decided to moderate my language to avoid sounding unduly provocative, which is not easy – I’ll come to other tags I might have chosen in a moment) cannot very easily stand up for themselves, even if they decide to try. Those willing to concede that their lives are held back by a deficit in braininess (sorry, but the lack of more appropriate words is unusually hampering) will very probably fail to grasp much, if anything at all, of the bigger picture, or be able to articulate any of the frustrations they may feel as daily they confront a prejudice so deeply entrenched that it passes mostly unseen. Well, it’s fun to pick on the idiots, blockheads, boneheads, thickos, cretins, dimwits, dunderheads, dunces, knuckleheads, dumbbells, imbeciles, morons, jerks, and simpletons of the world, isn’t it? It is the cheaper half of every comedy sketch, and in all likelihood will remain so; much of the rest of what brings us merriment being the schadenfreude of witnessing the self-same idiots cocking up over and over again. And finally, is there really a nicer word that usefully replaces all the pejoratives above? Our casual prejudice against the dim has been indelibly written into our dictionaries.

On May 13th, 1999, comedian George Carlin was invited to deliver a speech to the National Press Club in Washington, D.C. He used the occasion to poke fun at the tortuous abuse of language by politicians, as well as the growing tyranny of an invented “soft language”, which includes what he describes as ‘the tedious liberal labeling’ of minorities. His speech is followed by an entertaining Q&A session:

Here’s a little more from Carlin dishing the dirt on political correctness:


Now, if I’d been writing, say, a hundred years ago (or even more recently), the available vocabulary would have been a little different. For it was permissible during the first half of the last century to speak and write about the problem of ‘feeble-mindedness’ – a term that implies an innate (and thus inherited) ‘disability’. Moreover, as part of a quasi-scientific conversation, social reformers including intellectuals and political thinkers got into the habit of discussing how this affliction (as it was then regarded) might best be eradicated.

Those on the political left were no less shameful in this regard than those on the right, with radical thinkers like H.G. Wells 7 and George Bernard Shaw chipping in alongside the youthful Winston Churchill8; all scratching their high brows to think up ways of preventing the spread of such evidently bad stock from ruining good society – ‘the feeble-minded’ being, for reasons never dwelt upon by the pioneering eugenicists, perfectly capable of passing on their enfeebled genes.

Thanks again to genuine social progress, it is unacceptable today to speak (openly) about the elimination of the underclasses in our societies, or to openly speculate on means of halting their uncontrolled and unwanted proliferation (though I write very much in terms that Wells, Shaw and Churchill would have understood). But eugenics, we should constantly remind ourselves, was a great deal more fashionable not so very long ago – even after the concentration camps – and, worryingly, under alternative names it finds advocates still today (for instance, the Silicon Valley techies who gather nowadays for conferences on transhumanism, the artificial ‘enhancement’ of humanity, which is one way in which eugenics has reemerged9).

Today’s progressives (and keep in mind that Wells and Shaw both regarded themselves as progressives of their own times) prefer to adopt a more humanitarian position. Rather than eliminating ‘feeble-mindedness’, the concern is to assist ‘the disadvantaged’. This shift in social attitude is commendable, but it brings new hazards in its wake. For implicit in the new phraseology is the hope that, since disparities stem from disadvantage, all differences between healthy individuals might one day be overcome: that, aside from those suffering from disability, everyone has an approximately equivalent capacity when it comes to absorbing knowledge and learning skills of one form or another, and that society alone, to the advantage of some and the detriment of others, makes us smart or dim. But this is also false, and cruelly so – though not yet barbarously.

For differences in social class, family life, access to education, and so forth (those things we might choose to distinguish as environment or nurture) are indeed significant indicators of later intellectual prowess (especially when our benchmark is academic performance). So it makes for a comfortable presupposition that, regarding intelligence (an insanely complex matter to begin with), the inherent difference between individuals is slight and upbringing is the key determinant – but where’s the proof? And if this isn’t the whole picture – as it very certainly isn’t – then what if, heaven forfend, some people really are (pro)created less cognitively proficient than others? Given that they did indeed receive equivalent support through life, it follows that failure is “their own fault”, is it not?

In any case, intelligence, like attractiveness, must be to some degree a relative trait. During any historical period, particular forms of mental gymnastics are celebrated when others are overlooked, and so instruments to measure intelligence will automatically be culturally biased (there is a norm and there are fashions) to tally with the socially accepted idea of intelligence which varies from place to place and from one era to the next. There can never be an acid test of intelligence in any pure and absolute sense.10

Furthermore, whatever mental abilities happen to confer the mark of intelligence at any given time or place, obviously cannot be equally shared by everyone. As with other human attributes and abilities, there is likely to be a bell curve. It follows, therefore, that whatever braininess is or isn’t (and doubtless it takes many forms), during every age and across all nations, some people will be treated as dimmer, or brighter, than their fellows. And notwithstanding that whatever constitutes intelligence is socially determined to some extent, and that estimates of intelligence involve us in a monumentally complex matter, it remains the case that an individual’s capacity for acquiring skills and knowledge must be in part innate. This admission is both exceedingly facile and exceedingly important, and it is one that brings us right to the crux of meritocracy’s most essential flaw.

For how can those who are thought dim be left in charge of important things? They can’t. Which means that it would be madness to give the dimmest people anything other than the least intellectually demanding jobs. The meritocratic logic then follows, of course, that being less capable (and thus relegated to performing only the most menial tasks) makes you less worthy of an equal share, and yet this cuts tangentially across the very principle of ‘fairness’ which meritocracy is supposed to enshrine. For wherein lies the fairness in the economic exclusion of the dim? To reiterate what I wrote above, our prejudice is so deeply ingrained that to many such exclusion will still appear justified. As if being dim is your own lookout.

For whether or not an individual’s perceived failure to match up to society’s current gauge of intelligence is primarily down to educational ‘disadvantage’ (in the completest sense) or to reasons of an altogether more congenital kind, we may justifiably pass over the comfortable view that equal opportunity (laudable as this is) can entirely save the day. Degrees of intellectual competence – whether these turn out to be more socially or biologically determined – will always be with us; unless, that is, like Wells, Shaw and Churchill (together with many other twentieth-century social reformers including Theodore Roosevelt, Woodrow Wilson, Alexander Graham Bell, and the founder of Planned Parenthood, Margaret Sanger) we opt instead for the eugenic solution – and I trust we do not. But bear in mind that programmes of forced sterilisation kept running in regions of the western world long after WWII, right up to the 1970s.11 Earlier calls to weed out the “feeble-minded” never fully went away; they simply went underground.

On March 17th, 2016, ‘Democracy Now!’ interviewed Adam Cohen, author of “Imbeciles: The Supreme Court, American Eugenics, and the Sterilization of Carrie Buck”, who explained how:

After World War II, we put the leading Nazis on trial for some of the worst things that the Nazis did. One of those very bad things was they set up a eugenics program where they sterilized as many as 375,000 people. So we put them on trial for that. And lo and behold, as the movie [“Judgment at Nuremberg”] shows, their defense was: “How can you put us on trial for that? Your own U.S. Supreme Court said that sterilization was constitutional, was good. And it was your own Oliver Wendell Holmes, one of your most revered figures, who said that. So, why are we the bad guys in this story?” They had a point.

Click here to watch on the Democracy Now! website.


Now for those further thoughts from the man we might describe as “the father of meritocracy” – even though he would certainly hate the title! This is Michael Young speaking out about his accidental bastard child and the decisive role it has played in reshaping our societies:

I expected that the poor and the disadvantaged would be done down, and in fact they have been. If branded at school they are more vulnerable for later unemployment.

They can easily become demoralised by being looked down on so woundingly by people who have done well for themselves.

It is hard indeed in a society that makes so much of merit to be judged as having none. No underclass has ever been left as morally naked as that.12

This meritocracy we live in today, as Michael Young points out, is not just a distant remove from the fairest society imaginable, but in other ways – psychological ones especially – arguably crueller than any of the older, less enlightened -ocracies.

Embedded below is one of a series of lectures, “Biology as Ideology”, given by the distinguished geneticist and evolutionary biologist Richard Lewontin in 1990. Lewontin explains how erroneous theories of biological determinism have been used to validate and support the dominant sociopolitical theories, and vice versa. He also offers his subversive thoughts on meritocracy:


Inevitably, ‘merit’ is equated with, and thus mistaken for, ‘success’, and this is true not only for our self-declared meritocracy, but universally. Think about it: if millions of people love to read your books, or to listen to your songs, or just to watch your delightful face on their TV screens, then who would not leap to the conclusion that what they do is of the highest ‘merit’? How else did they rise to stand above the billions of ordinary anonymous human drones?

The converse is also true: those who remain anonymous are often in the habit of regarding themselves as less significant – in fact, psychologically less real – than the people they see and admire in the limelight: the celebrities and the VIPs. Which brings me to a lesson my father taught me; an observation which reveals in aphoristic form the inbuilt fault with all conceptions of meritocracy. ‘VIP’ is a term that makes him curse. Why? For the clinching fact that every one of us is a “very important person”. If this sounds corny or trite, then ask yourself sincerely, as my father once asked me: “Are you a very important person…?”


Famously, Van Gogh sold just a single painting in his lifetime13, but then millions of terrible painters have also sold one (or fewer!). Not so widely known is that a great deal of Schubert’s music was lost when, in the immediate aftermath of his death, it was recycled as waste paper; but then again, thousands of dreadful composers have also had their music posthumously binned. So the odds are that if you can’t sell your music or publish your book, then you’re just another of the billions, rather than an as-yet-unappreciated master, another Van Gogh or Schubert. For aside from posterity, and no matter how much we might like to conjure one up, there is no established formula for separating ‘merit’ from ‘success’, and no good reason for supposing we will ever discover such a razor.

In reality, therefore, any form of meritocracy will only ever be a form of success-ocracy, and in our own system money is the reification of success. It is a system in which success, and thus money, invariably breeds more success and more money, because unavoidably it contains self-reinforcing feedback loops. For this reason the well-established ruling oligarchies will never be unseated by means of any notional meritocracy – evidence of their enduring preeminence being, somewhat ironically, more apparent in the American republic, where dynasties, especially political ones, are less frowned upon, and in consequence have remained more visible than in the class-ridden island kingdom America abandoned and then defeated. But even if our extant aristocracies were one day uprooted wholesale, meritocracy simply opens the way for an alternative uber-class founded by the “self-made man”.

Indeed, ‘aristocracy’, deriving from the Greek ἀριστοκρατία (aristokratia) and literally meaning “rule of the best”, sounds a lot like ‘meritocracy’ to me. Whereas governance by those selected as most competent (the other way ‘meritocracy’ is sometimes defined) is better known by an alternative name too – ‘technocracy’ in this case – and to whose betterment the select order of technocrats works, we might reasonably ask. Meritocracy of both kinds – and every meritocratic system must combine these twin strands – has fascistic overtones.

The promise of meritocracy has been seductive largely because of its close compatibility with neoliberalism, today’s predominant – in fact unrivalled – politico-economic ideology. Predicated on the realistic premise that humans do indeed have an ingrained predisposition to social hierarchy (something that traditional concepts of egalitarianism sought to abolish), it offers a reconfigured market solution to foster a sort of laissez-faire egalitarianism: the equalisation of wealth and status along lines that are strictly “as nature intended”. Furthermore, it appeals to some on the left by making a persuasive case for “equality of opportunity”, if always to the detriment of the more ambitious goal of “equality of outcome” – a sidelining that has led to a dramatic lowering of the bar with regard to what even qualifies as social justice.

Moreover, the rightward drift to meritocracy involves the downplaying of class politics in favour of today’s more factional and brittle politics of identity. This follows because under meritocracy the rigid class barriers of yesteryear are ostensibly made permeable and in the long run must slowly crumble away altogether. In reality, of course, social mobility is heavily restricted, for reasons already discussed at length. And this abandonment of class politics in favour of the divisiveness of identity politics is greatly to the benefit of the ruling establishment, whose oldest maxim is divide and conquer.

Interestingly, of the many advocates of meritocracy – from Thatcher to Reagan; Brown to Blair; Cameron to Obama; Merkel to May – none has bothered to define their terms very precisely. What do they mean to imply by ‘merit’ and its innately slippery counterpart ‘fairness’? And whilst they talk of ‘fairness’ over and over again – ‘fairness’ purportedly underlying every policy decision they have ever taken – the actual direction in which all this ‘fairness’ was leading has caused a few to wonder whether ‘fairness’ might be wrong in principle! As with other grossly misappropriated abstract nouns – ‘freedom’ and ‘democracy’ spring instantly to mind – the difficulty here is that ‘fairness’ makes a handy fig-leaf.

Instead, if we genuinely wish to live in a society striving for greater equality, then the political emphasis ought not to be placed too heavily on woolly notions like ‘merit’ or ‘fairness’, but upon enabling democracy in the fullest sense. The voice of the people may not be the voice of God, but democracy is, to paraphrase Churchill (who mostly hated it), the least worst system.14 One person, one vote, if not quite the bare essence of egalitarianism, serves both as a fail-safe and as a necessary foundation.

Of course, we must always guard against the “tyranny of the majority” by means of a constitutional framework that ensures basic rights and freedoms for all. Democracy offers an imperfect solution, but, cleverly conceived and justly organised, neither is it what so many right-wing libertarians are quick to call it: “two wolves and a sheep deciding what to have for dinner”. This sideswipe is not just glib, but a far better description of the extreme right-wing anarchy they themselves advocate. In reality, it is their beloved ‘invisible hand’ that better ensures rampant inequality and social division, and, for so long as its influence remains unseen and unfettered, will continue to do so, by rigging elections and tipping the scales of justice.

Democracy – from its own etymology: rule by the people – is equality in its most settled form. Yet if such real democracy is ever to arise and flourish then we must have a free-thinking people. So the prerequisite for real democracy is real education – sadly we are a long way short of this goal too and once again heading off in the wrong direction. But that’s for a later chapter.

Next chapter…


Addendum: our stakeholder society and the tyranny of choice

Prior to the rise of Jeremy Corbyn and, to a lesser extent, Bernie Sanders (for further thoughts on Sanders read my earlier posts), mainstream politics in Britain and America, as more widely, had converged to such a high degree that opposition parties were broadly in conjunction. Left and right had collapsed to form a single “centrist” amalgam in agreement across a wide range of diverse issues, from race relations, gender equality, immigration and environmentalism to foreign policy and, most remarkably, economics. In Britain, as in America, the two major parties ceased even to disagree over the defining issue of nationalisation versus privatisation, because both sides now approved of the incorporation of private sector involvement into every area of our lives. “Big government”, our politicians echoed in unison, is neither desirable nor any longer possible. Instead, we shall step aside for big business, and limit ourselves to resolving “the real issues”.

The real issues? Why yes, with the business sector running all the fiddly stuff, governments pivoted to address the expansion of individual opportunity and choice. Especially choice. Choice now became the paramount concern.

Even the delivery of essential public services, once the duty of every government (Tory and Labour alike), began to be outsourced. No sacred cows. It became the common doctrine that waste and inefficiency in our public services would be abolished by competition, including the introduction of internal markets and public-private partnerships, which, aside from helping to foster efficiency, would, importantly, diversify customer choice once again.

Under the new social arrangement, we, the people, became “stakeholders” in an altogether more meritocratic venture. Here is Tony Blair outlining his case for our progressive common cause:

“We need a country in which we acknowledge an obligation collectively to ensure each citizen gets a stake in it. One Nation politics is not some expression of sentiment, or even of justifiable concern for the less well off. It is an active politics, the bringing of a country together, a sharing of the possibility of power, wealth and opportunity…. If people feel they have no stake in society, they feel little responsibility towards it, and little inclination to work for its success. ….”15

Fine aspirations, you may think. But wait – and let’s remember that Blair trained as a lawyer, so every word here counts. “Sharing in the possibility of power…” Does this actually mean anything at all? Or take his first sentence, which ends: “…to ensure each citizen gets a stake in it” – “it” in this context presumably meaning “the country” (his subject at the beginning). But every citizen already has a stake in the country, doesn’t s/he? Isn’t that what being a citizen means: to be a member of a nation state with an interest, or ‘stake’ (if we insist), in what goes on? However, according to Blair’s “One Nation” vision, members of the public (as we were formerly known) are seemingly required to become fully paid-up “stakeholders”. But how…?

Do we have to do something extra, or are our “stakeholder” voices to be heard simply by virtue of the choices we make? Is this the big idea? The hows and wheres of earning a salary, and then of spending or else investing it: is this to be the main measure of our “stakeholder” participation? In fact, is “stakeholder” anything other than “stockholder” in UK plc? Or is it less than this? Is “stakeholder” substantially different from “consumer”? According to the Financial Times lexicon’s definition, a stakeholder society is:

“A society in which companies and their employees share economic successes.”16

Well, I certainly don’t recall voting for that.


We are increasingly boggled by choice. Once there was a single electricity supply and a single gas supply – one price fitting all. Now we have literally dozens of companies offering different deals – yet all these deals finally deliver an entirely identical supply of electricity and gas. The only difference is the price, but still you have to choose. So precious moments of our once-around-the-sun existence are devoted to worrying about which power company is charging the least. And the companies know all this, of course, so they make their deals as complicated as possible. Perhaps you’ll give up and choose the worst of the options – for the companies concerned this is a winning strategy; thinking about it, this is their only winning strategy! Or, if you are of a mind to waste a few more of your precious, never-to-be-returned moments of existence, you may decide to check one of the many comparison websites – but again, which one? Just one inane and frustrating choice after another. And more of those tiresome tickboxes to navigate.

But choice is everything. So we also need to worry more about the latest school and hospital league tables. It is vital to exercise our right to choose in case an actual ambulance arrives with its siren already blaring. In these circumstances we need to be sure that the ambulance outside is bound for a hospital near the top of the league, because it is in the nature of leagues that there is always a bottom – league tables giving a relative assessment, and ensuring both winners and losers.

And provided an entirely free choice – one not based on catchment areas – what parent in their right mind elects to send their offspring to a worse school over a better one? So are we just to hope that our nearest school and/or hospital is not ranked bottom? Thankfully, house prices save much of the time in helping to make these determinations.

Meantime, I struggle to understand what our politicians and civil servants get up to in Whitehall these days. Precisely what do those who walk the corridors of power find to do each day? Reduced to the role of managers, what is finally left for them to manage?

And where is all of this choice finally leading? In the future, perhaps, in place of elections, we will be able to voice our approval/dissatisfaction by way of customer surveys. With this in mind, please take a moment to select the response that best reflects your own feelings:

Given the choice, would you say you prefer to live in a society that is:

More fair

Less fair

Not sure


Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.


1 Quote taken from Chapter 10 of George Orwell’s satirical fairytale Animal Farm, published in 1945. After the animals have seized power at the farm they formulate “a complete system of thought” which is designed to unite the animals as well as to prevent them from returning to the evil ways of the humans. The seventh and last of these original commandments of ‘Animalism’ is straightforwardly that “All animals are equal”; however, after the pigs have risen to dominance again, the sign is revised so that this last commandment reads “All animals are equal, but some animals are more equal than others”.

2 From an article entitled “Down with meritocracy: The man who coined the term four decades ago wishes Tony Blair would stop using it” written by Michael Young, published in the Guardian on June 29, 2001.

3 Quote taken from a sermon by Martin Luther King Jr. delivered at Temple Israel of Hollywood on February 25, 1965. In fuller context, he said:

“And I believe it because somehow the arc of the moral universe is long but it bends toward justice. We shall overcome because Carlyle is right: “No lie can live forever.” We shall overcome because William Cullen Bryant is right: “Truth crushed to earth will rise again.” We shall overcome because James Russell Lowell is right: “Truth forever on the scaffold, wrong forever on the throne. Yet, that scaffold sways the future and behind the dim unknown standeth God within the shadow, keeping watch above his own.”

An audio recording of King’s speech and a full transcript is available here:

4 Quote taken from a meeting on April 22, 2000 with American white supremacist and former Grand Wizard of the Ku Klux Klan, David Duke, which was recorded in the “American Friends of the British National Party” video.

In fuller context Griffin says:

“Perhaps one day, once by being rather more subtle we got ourselves in a position where we control the British Broadcasting media and then we tell ’em really how serious the immigration problem was, and we tell them the truth about a lot of the crime that’s been going on, if we tell ’em really what multiracialism has meant and means for the future, then perhaps one day the British people might change their mind and say yes every last one must go.  Perhaps they will one day.  But if you hold that out as your sole aim to start with, you’re going to get absolutely nowhere. So instead of talking about racial purity, you talk about identity, and about the needs and the rights and the duty to preserve and enhance the identity of our own people.  My primary identity quite simply is there (points to veins in wrist). That’s the thing that counts.”

The clip was shown in BBC1’s Panorama: Under the Skin first broadcast on November 25, 2001.

The transcript is available here:

5 Although these words are frequently attributed to Wilde himself, they actually belong to one of his characters, Lord Henry Wotton, who says: “To me, beauty is the wonder of wonders. It is only shallow people who do not judge by appearances. The true mystery of the world is the visible, not the invisible.” Taken from Chapter 2 of Wilde’s once scandalous novel The Picture of Dorian Gray.

6 The “Class Sketch” was first broadcast on April 7, 1966 in an episode of David Frost’s satirical BBC show The Frost Report. It was written by Marty Feldman and John Law, and performed by John Cleese, Ronnie Barker and Ronnie Corbett in descending order of height!

7 Anticipations of the Reaction of Mechanical and Scientific Progress upon Human Life and Thought (1901) is one of H.G. Wells’ earliest blueprints for the future. Set in 2000, a youthful Wells (aged 34) suggested an altogether more matter-of-fact solution to the problem of what he then called “the People of the Abyss” than a promise of education, education, education (the commentary is my own of course):

“It has become apparent that whole masses of human population are, as a whole, inferior in their claim upon the future, to other masses, that they cannot be given opportunities or trusted with power as superior peoples are trusted, that their characteristic weaknesses are contagious and detrimental in the civilizing fabric, and that their range of incapacity tempts and demoralises the strong. To give them equality is to sink to their level, to protect and cherish them is to be swamped in their fecundity…”

Which is putting it most politely! Oh dear, oh dear! What has happened to the clarion call for freedom and equality? (And here I mean equality of opportunity, since to be fair Wells was ever the egalitarian, consistently keener on meritocracy than on any of the more radical ideals of wealth redistribution.) Might it be that the young Mr Wells was showing his truer colours? Let us go on a little:

“The new ethics will hold life to be a privilege and a responsibility, not a sort of night refuge for base spirits out of the void; and the alternative in right conduct between living fully, beautifully, and efficiently will be to die.”

Just who are the hideous hordes whom Wells so pities and despises (in roughly equal measure)? Let us read on:

“…the small minority, for example, afflicted with indisputably transmissible diseases, with transmissible mental disorders, with such hideous incurable habits of the mind as the craving for intoxication…”

But he’s jesting… isn’t he?

“And I imagine also the plea and proof that a grave criminal is also insane will be regarded by them [the men of the New Republic] not as a reason for mercy, but as an added reason for death…”

Death? Why not prison and rehabilitation…?

“The men of the New Republic will not be squeamish either, in facing or inflicting death, because they will have a fuller sense of the possibilities of life than we possess…”

Ah, I see, yes since put like that… yes, yes, death and more death, splendid!

“All such killing will be done with an opiate, for death is too grave a thing to be made painful or dreadful, and used as a deterrent for crime. If deterrent punishments are to be used at all in the code of the future, the deterrent will neither be death, nor mutilation of the body, nor mutilation of the life by imprisonment, nor any horrible things like that, but good scientifically caused pain, that will leave nothing but memory…”

An avoidance of nasty old pain… that’s good I suppose.

“…The conscious infliction of pain for the sake of pain is against the better nature of man, and it is unsafe and demoralising for anyone to undertake this duty. To kill under the seemly conditions science will afford is a far less offensive thing.”

Death, yes, a more final solution, of course, of course…

This is horrifying, of course, especially in light of what followed historically.

Deep down Wells was an unabashed snob, though hardly exceptional for his time. Less forgivably, he was a foaming misanthropist (especially when sneering down at the hoi polloi). But mostly he longed to perfect the human species, and as a young man had unflinchingly advocated interventions no less surgical than those needed to cure any cancerous organ. But then it was once fashionable for intellectual types to seek scientific answers to social problems: programmes of mass sterilisation and selective reproduction.

His Fabian rival George Bernard Shaw had likewise talked of selective breeding in his own quest to develop a race of supermen, whilst Julian Huxley, Aldous’s big brother, was perhaps the foremost and pioneering advocate of eugenics, later coining the less soiled term ‘transhumanism’ to lessen the post-Nazi stigma. Judged in the broader historical context therefore, Wells was simply another such dreaming ideologue.

That Wells was also one of the first to use the term “new world order” may be of little lasting significance, however totalitarian his visions for World Socialism; importantly, though, Wells was never in a position to realise his grander visions, in spite of being sufficiently well-connected to arrange private meetings with President Franklin D. Roosevelt, who entertained him over dinner, and with Joseph Stalin at the Kremlin. Finally, he was unable to inspire enough significant others to engage in his “open conspiracy”.

All extracts above are taken from Anticipations of the Reaction of Mechanical and Scientific Progress upon Human Life and Thought, Chapman & Hall, 1901.


8 Like most of his contemporaries, family and friends, he regarded races as different, racial characteristics as signs of the maturity of a society, and racial purity as endangered not only by other races but by mental weaknesses within a race. As a young politician in Britain entering Parliament in 1901, Churchill saw what were then known as the “feeble-minded” and the “insane” as a threat to the prosperity, vigour and virility of British society.

The phrase “feeble-minded” was to be defined as part of the Mental Deficiency Act 1913, of which Churchill had been one of the early drafters. The Act defined four grades of “Mental Defective” who could be confined for life, whose symptoms had to be present “from birth or from an early age.” “Idiots” were defined as people “so deeply defective in mind as to be unable to guard against common physical dangers.” “Imbeciles” were not idiots, but were “incapable of managing themselves or their affairs, or, in the case of children, of being taught to do so.” The “feeble-minded” were neither idiots nor imbeciles, but, if adults, their condition was “so pronounced that they require care, supervision, and control for their own protection or the protection of others.”

Extract taken from a short essay called “Churchill and Eugenics” written by Sir Martin Gilbert, published on May 31, 2009 on the Churchill Centre website.

9 “Population reduction” is another residue of the old eugenics programme, freshly justified on purportedly scientific and seemingly less terrible neo-Malthusian grounds – when previous “population reduction” was unashamedly justified and executed on the basis of the pseudoscience of eugenics, the pruning was always done from the bottom up, of course.

10 Aside from being the invention of pioneering eugenicist Francis Galton, the IQ test was a pseudoscientific approach that first appeared to be validated thanks to the research of Cyril Burt, who had devised ‘twin studies’ to prove the heritability of IQ. However, those studies turned out to be fraudulent:

“After Burt’s death, striking anomalies in some of his test data led some scientists to reexamine his statistical methods. They concluded that Burt manipulated and probably falsified those IQ test results that most convincingly supported his theories on transmitted intelligence and social class. The debate over his conduct continued, but all sides agreed that his later research was at least highly flawed, and many accepted that he fabricated some data.”

From the current entry in Encyclopaedia Britannica.


11 Eugenics is now rightly abjured, if only for its abominable record of cruelty. But the cruelty of the many twentieth-century programmes of eugenics was hardly incidental. Any attempt to alter human populations to make them fit an imposed social structure by means of the calculated elimination and deliberate manipulation of genetic stock automatically reduces people to the level of farm animals.

It should be remembered too that what the Nazis tried to achieve by mass murder across Europe was only novel in terms of its extremely barbarous method. Eugenics programmes to eliminate “inferior” populations by forced sterilisation had been introduced earlier in America and continued surreptitiously into the 1970s. For instance, there was a secret programme for the involuntary sterilisation of Native American women long after the Second World War.

12 From the same Guardian article entitled “Down with meritocracy” written by Michael Young, published in June, 2001.

13 Van Gogh famously sold just one painting during his lifetime, Red Vineyard at Arles, a painting that now resides at the Pushkin Museum in Moscow. The rest of Van Gogh’s more than 900 paintings were neither sold nor came to public attention until after his death.


14 “Many forms of Government have been tried and will be tried in this world of sin and woe. No one pretends that democracy is perfect or all-wise. Indeed, it has been said that democracy is the worst form of government except all those other forms that have been tried from time to time.”

— Winston Churchill in a speech to the House of Commons, November 11, 1947.

15 Tony Blair speaking in Singapore on January 7, 1996.

16 The source for this definition is given as the Longman Business English Dictionary (although the link is lost).


Filed under analysis & opinion, « finishing the rat race », financial derivatives, neo-liberalism, Noam Chomsky

all work and no play

The following article is Chapter Six of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year (and beyond). Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from the category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.


BOSWELL, “But, sir, the mind must be employed, and we grow weary when idle.”
JOHNSON, “That is, sir, because others being busy, we want company; but if we were all idle, there would be no growing weary; we should all entertain one another… But no man loves labour for itself.”


Leaving aside the various species of bats and whales, very nearly all mammals are land-dwelling creatures. In fact, nearly all animals – meaning quadrupeds – spend their lives earthbound. For millennia humans too occupied the same earthbound sphere alongside fellow ground-dwelling organisms. So consider then the following: at this precise moment upwards of six thousand scheduled airliners are aloft in our skies, and at peak times as many as ten thousand are flying high above the clouds. Each of these airborne vessels is packed with several hundred perfectly ordinary human beings seated in rows, hurtling above our heads at altitudes exceeding thirty thousand feet and at speeds above 500 miles per hour. This equates to literally millions of people airborne at each and every moment of each and every day – a significant proportion of the entire human population!

Now consider this: prior to December 17th 1903, only a handful of our species had ever lifted off the surface of the planet by any means at all, and not a single human being had ever experienced powered flight. But then, on that fateful day, Orville and Wilbur Wright made three successful flights between them. On his first take-off, Orville covered 120 feet, remaining airborne for just 12 seconds. On the final flight, he valiantly managed 200 feet, all at an altitude of only ten feet. A century on, we have Airbus – note the humdrum name of the company! – and the launch of its A380, the world’s largest passenger jet, which accommodates between 525 and 850 individuals, and is capable of flying approximately 10,000 miles nonstop. Thus, thanks to technology we have grown wings and been transformed into a semi-airborne species; entirely forgetting to be astonished by this remarkable fact is perhaps the final measure of our magnificent achievement.


“The world is undergoing immense changes. Never before have the conditions of life changed so swiftly and enormously as they have changed for mankind in the last fifty years. We have been carried along – with no means of measuring the increasing swiftness in the succession of events. We are only now beginning to realize the force and strength of the storm of change that has come upon us.

These changes have not come upon our world from without. No meteorite from outer space has struck our planet; there have been no overwhelming outbreaks of volcanic violence or strange epidemic diseases; the sun has not flared up to excessive heat or suddenly shrunken to plunge us into Arctic winter. The changes have come through men themselves. Quite a small number of people, heedless of the ultimate consequence of what they did, one man here and a group there, have made discoveries and produced and adopted inventions that have changed all the conditions of social life.”

These are the opening paragraphs from a lesser-known work by H.G. Wells. The Open Conspiracy, an extended essay written in 1928, was the first of Wells’ most earnest attempts to set the world to rights. Stumbling across it one day, it struck me that this voice from ninety years ago still chimes. I couldn’t help wondering indeed if we aren’t still in the midst of those same “immense changes”, being swept along by an, as yet, undiminished “storm of change”.

Wells, who uses the word ‘change’, in various formulations, no less than seven times (in a mere eight sentences), goes on to compare our modern wonders to the seven wonders of the ancient world, intending to emphasise their novel potency:

“Few realized how much more they were than any “Wonders.” The “Seven Wonders of the World” left men free to go on living, toiling, marrying, and dying as they had been accustomed to for immemorial ages. If the “Seven Wonders” had vanished or been multiplied three score it would not have changed the lives of any large proportion of human beings. But these new powers and substances were modifying and transforming – unobtrusively, surely, and relentlessly – every particular of the normal life of mankind.”

Wells had been trained as a scientist, and more than this, a scientist at a time when science was reaching its apogee. At the Royal College of Science2, he had studied biology under the tutelage of T. H. Huxley, the man who most publicly defended Darwin’s theory. In the debates against the Bishop of Oxford, Samuel Wilberforce, it was Huxley who challenged and defeated the permitted orthodoxy of divine creation by showing how Science makes a better account of our origins than religious authority; so in an important sense, Huxley must be seen as one of the pioneers of this scientific revolution. With religion rather abruptly and rudely dismissed, it was open to the scientists and technologists to lead us all to salvation.

Wells was keen to get involved, if only as one of science and technology’s most passionate and outspoken advocates.  Growing up in late Victorian Britain, he was well acquainted with how systems of mass production had mostly superseded manual methods to become the predominant form of industrial process. Likewise, he had witnessed the spread of agricultural machines for planting seeds and harvesting crops, and of automotive machines transporting loads and providing ever more reliable and comfortable means for human transit. These innovations had led to a dramatic increase both in production and, more importantly, in productivity, and machine processes were set to become ever more versatile and reliable.

Wells was amongst the first to seriously consider how these new modes of manufacture with their greater efficiencies and capacities for heavier constructions, not to mention for longer range transportation and communication, would bring rapid and sweeping changes to ordinary life. Most importantly, he understood that since technology potentially allowed the generation of almost limitless power, its rise would unstoppably alter human affairs forever, and by extension, impact upon the natural world too.

Quite correctly, Wells went on to forecast an age to come (our age), in which ordinary lives are transformed to an extent so far beyond the technological transformations of past ages that life is unutterably and irreversibly altered. Yet the widespread access to these “wonders”, as he insistently calls them, causes us to regard them as so ordinary that we seldom, if ever, stop to wonder about them.

For machines are nowadays embedded quite literally everywhere – one is in fact translating the movement of my fingertips into printed words, whilst another happens to be reproducing the soulful precision of Alfred Brendel’s rendition of one of Franz Schubert’s late sonatas on a machine of still older conception (the piano) via yet another machine that preserves sound in the form of electrical impulses. Thanks to machines of these kinds, not only the sheet-music – those handwritten frequency-time graphs so painstakingly drafted, perhaps by candlelight, and very certainly using only a feather quill and inkpot – but thousands upon thousands of musical (and other) performances can be conjured up with literally “a click”. The snapping fingers of an emperor could never have summoned such variety. But then the internet is a wonder far exceeding even H.G. Wells’ far-seeing imagination.


More than a century ago, the poet, satirist and social commentator Oscar Wilde was another who looked forward to a time of such “wonders”. For Wilde, as for Wells, they presented reasons to be cheerful:

“All unintelligent labour, all monotonous, dull labour, all labour that deals in dreadful things, and involves unpleasant conditions, must be done by machinery. Machinery must work for us in coal mines, and do all sanitary services, and be the stoker of steamers, and clean the streets, and run messages on wet days, and do anything that is tedious and distressing… There is no doubt at all that this is the future of machinery; and just as trees grow while the country gentleman is asleep, so while Humanity will be amusing itself, or enjoying cultivated leisure – which, and not labour, is the aim of man – or making beautiful things, or reading beautiful things, or simply contemplating the world with admiration and delight, machinery will be doing all the necessary and unpleasant work. The fact is that civilization needs slaves… [But] Human slavery is wrong, insecure and demoralizing. On mechanical slavery, on the slavery of the machine, the future of the world depends.”3

Wilde and Wells were optimists, but cautious ones, and each foretold new dangers that potentially lay in wait for us. Wells wrote:

“They [the new “wonders”] increased the amount of production and the methods of production. They made possible “Big-Business,” to drive the small producer and the small distributor out of the market. They swept away factories and evoked new ones. They changed the face of the fields. They brought into the normal life, thing by thing and day by day, electric light and heating, bright cities at night, better aeration, new types of clothing, a fresh cleanliness. They changed a world where there had never been enough into a world of potential plenty, into a world of excessive plenty.”4

Wells believed that the very successes which brought about large-scale manufacturing and distribution, as well as commensurate developments in fields such as agriculture, sanitation and medicine – ones that were already extending average life-expectancy – might still feasibly bring heavier burdens to bear on the planet. Left unchecked, he argued, our species would finish by using up everything, whilst exponentially crowding ourselves out of existence. So these new “wonders” were a double-edged sword. And then what of “excessive plenty” – of too much of a good thing – how do we avoid replacing one set of miseries with another? Such were Wells’ concerns; but then Wells owed a great deal to the eternal pessimist Thomas Malthus.

By contrast, writing at the dusk of the Victorian era, Wilde is not so much bothered, as Wells is, by the prospect of society overrun by a burgeoning and profligate mass of humanity, but by how we can ensure that the new prosperity, so long awaited and desperately overdue, is fairly distributed. After all, progress had until then been primarily technological in form and not social, and it appeared to Wilde that the costs of industrialisation still hugely outweighed its benefits.

The centuries of Industrial Revolution had claimed so many victims. Not only those trapped inside the mills and the mines, the wage-slaves working all the hours God sends for subsistence pay, but those still more benighted souls incarcerated in the workhouses, alongside their malnourished children, who from ages six upwards might be forced underground to sweat in the mines or else to clamber about in the more choking darkness of chimneystacks.5 Industrial development meant that for the majority of adults and children (boys and girls) life was sunk into a routine of unremitting hardship and ceaseless backbreaking labour, as the poor were ruthlessly sacrificed to profit their masters – one big difference today, of course, is that our own sweatshops are more distant.

To abolish this class-ridden barbarism, Wilde therefore proposed an unapologetically radical solution:

“Up to the present, man has been, to a certain extent, the slave of machinery, and there is something tragic in the fact that as soon as man had invented a machine to do his work he began to starve. This, however, is, of course, the result of our property system and our system of competition. One man owns a machine which does the work of five hundred men. Five hundred men are, in consequence, thrown out of employment, and having no work to do, become hungry and take to thieving. The one man secures the produce of the machine and keeps it, and has five hundred times as much as he should have, and probably, which is of more importance, a great deal more than he really wants. Were that machine the property of all, everyone would benefit by it.”6


In case Wilde’s enthusiasm for collective ownership encourages you think it, then please be assured that he was not exactly a Leninist (as you will see), nor, in any traditional sense, was he a fully-fledged Marxist. In fact, if anything Wilde was an anarchist, heaping special praise on Peter Kropotkin, whom he once described as: “a man with a soul of that beautiful white Christ which seems coming out of Russia.”7

Now it is interesting and worthwhile, I think, to compare the views of Wilde, writing just a few decades earlier, with those of H.G. Wells, for both held notionally left-leaning sympathies and both were broadly hopeful; each underscored the special importance of science and technology when it came to achieving such desirable goals as ending poverty and rebuilding a fairer society. For in some regards, Wilde’s perspective is orthogonally different to Wells’ – and it is Wells who made the better communist (though he remained deeply antagonistic towards Marx for other reasons).

For Wells was an unflinching collectivist, and thus forever seeking solutions in terms of strict autocratic control. For instance, in one of the concluding chapters of The Open Conspiracy, Wells outlines “seven broad principles” that will ensure human progress of which the sixth reads as follows:

“The supreme duty of subordinating the personal career to the creation of a world directorate capable of these tasks [ones that will ensure the betterment of mankind] and to the general advancement of human knowledge, capacity, and power”8.

Wilde, on the contrary, unswervingly insisted that above all else the sovereign rights of the individual must be protected. That personal freedom must never be horse-traded, since “the true personality of man”, as he puts it, is infinitely more precious than any amount of prospective gains in comfort and security. This is precisely where Wilde is at his most prescient, foreseeing the dangers of socialist authoritarianism a full two decades before the Russian revolution, and instinctively advising a simple cure:

“What is needed is Individualism. If the Socialism is Authoritarian; if there are governments armed with economic power as they are now with political power; if, in a word, we are to have Industrial Tyrannies, then the last state of man will be worse than the first.”9

So compare Wilde’s earlier views to those of Wells fifty years on, by which time the Soviet model was up and running, and yet he is still advocating the need for a more widespread and overarching central authority: ultimately, a world government to coerce and co-ordinate the masses into the new age of socialism; even to the point of eradicating misfits for the sake of the greater good.

For Wells, every answer to resolving humanity’s problems involved the implementation of top-down governance, with the patterns of individual behaviour controlled by means of an applied political force-field; whereas Wilde was equally insistent that individuals are not uniformly alike, like atoms, and must be permitted, so far as is humanly possible, to organise themselves. It is a fundamental difference in outlook that is reflected in their attitudes towards work.


The inherent value of work is rarely questioned by Wells. In his earlier fictional work A Modern Utopia he answers his own inquiry “will a Utopian be free to be idle?” as follows:

“Work has to be done, every day humanity is sustained by its collective effort, and without a constant recurrence of effort in the single man as in the race as a whole, there is neither health nor happiness. The permanent idleness of a human being is not only burthensome to the world, but his own secure misery.”10

Wells is expressing a concern that once the labouring masses are relieved of their back-breaking obligation to work, they may “develop a recalcitrance where once there was little but fatalistic acquiescence”:

“It is just because labour is becoming more intelligent, responsible, and individually efficient that it is becoming more audible and impatient in social affairs. It is just because it is no longer mere gang labour, and is becoming more and more intelligent co-operation in detail, that it now resents being treated as a serf, housed like a serf, fed like a serf, and herded like a serf, and its pride and thoughts and feelings disregarded. Labour is in revolt because as a matter of fact it is, in the ancient and exact sense of the word, ceasing to be labour at all.”11

For these reasons, Wells senses trouble ahead, whereas for Wilde, these same changes in modes of employment serve as further reasons to be cheerful:

“[And] as I have mentioned the word labour, I cannot help saying that a great deal of nonsense is being written and talked nowadays about the dignity of labour. There is nothing necessarily dignified about manual labour at all, and most of it is absolutely degrading. It is mentally and morally injurious to man to do anything in which he does not find pleasure, and many forms of labour are quite pleasureless activities, and should be regarded as such. To sweep a slushy crossing for eight hours on a day when the east wind is blowing is a disgusting occupation. To sweep it with joy would be appalling. Man is made for something better than disturbing dirt. All work of that kind should be done by machine.”12

In his essay, Wilde, unlike Wells, is unabashed in confessing to his own Utopianism, writing:

“Is this Utopian? A map of the world that does not include Utopia is not worth even glancing at, for it leaves out one country at which Humanity is always landing. And when humanity lands there, it looks out, and, seeing a better country, sets sail. Progress is the realization of Utopias.”13

But then, both Wilde and Wells were dreaming up Utopias during an age when dreaming about Utopia remained a permissible intellectual pursuit. It is just that Wilde’s dream is so much grander than any of Wells’ visions. Wells was certainly an astute forecaster and could see with exceptional acuity what immediately awaited humanity around the next few corners; Wilde, on the other hand, sought to navigate across a wider ocean. He did not wish to be constrained by the tedious encumbrances of his own time, and regarded the complete abolition of hard labour as an absolutely essential component of a better future. Even then, he was far from alone.


Writing in the thirties, Bertrand Russell was another outspoken advocate of cultured laziness. Russell, who is now venerated by some almost as a secular saint, was nothing of the sort. Many of his views on politics and society were highly disagreeable and he was arguably one of the dreariest philosophers ever published, but this aside he was a supreme mathematician. It is noteworthy therefore that in order to support his own expressed desire for reducing the average workload, he did a few very simple sums. These led him to what he regarded as the most important, yet completely overlooked, lesson to be learned from the Great War.

At a time when the majority of the able-bodied population were busily fighting or else engaged in other means of facilitating the destructive apparatus of war, new modes of production had maintained sufficiency, and yet, as Russell pointed out, the true significance of this outstanding triumph of the new technologies was altogether masked by the vagaries of economics. He writes:

“Modern technique has made it possible to diminish enormously the amount of labour required to secure the necessaries of life for everyone. This was made obvious during the war. At that time all the men in the armed forces, and all the men and women engaged in the production of munitions, all the men and women engaged in spying, war propaganda, or Government offices connected with the war, were withdrawn from productive occupations. In spite of this, the general level of well-being among unskilled wage-earners on the side of the Allies was higher than before or since. The significance of this fact was concealed by finance: borrowing made it appear as if the future was nourishing the present. But that, of course, would have been impossible; a man cannot eat a loaf of bread that does not yet exist. The war showed conclusively that, by the scientific organization of production, it is possible to keep modern populations in fair comfort on a small part of the working capacity of the modern world. If, at the end of the war, the scientific organization, which had been created in order to liberate men for fighting and munition work, had been preserved, and the hours of work had been cut down to four, all would have been well. Instead of that the old chaos was restored, those whose work was demanded were made to work long hours, and the rest were left to starve as unemployed.”

And so to the sums – easy stuff for a man who had previously tried to fathom a complete axiomatic system for all mathematics:

“This is the morality of the Slave State, applied in circumstances totally unlike those in which it arose. No wonder the result has been disastrous. Let us take an illustration. Suppose that, at a given moment, a certain number of people are engaged in the manufacture of pins. They make as many pins as the world needs, working (say) eight hours a day. Someone makes an invention by which the same number of men can make twice as many pins: pins are already so cheap that hardly any more will be bought at a lower price. In a sensible world, everybody concerned in the manufacturing of pins would take to working four hours instead of eight, and everything else would go on as before. But in the actual world this would be thought demoralizing. The men still work eight hours, there are too many pins, some employers go bankrupt, and half the men previously concerned in making pins are thrown out of work. There is, in the end, just as much leisure as on the other plan, but half the men are totally idle while half are still overworked. In this way, it is insured that the unavoidable leisure shall cause misery all round instead of being a universal source of happiness. Can anything more insane be imagined?”
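Russell’s pin arithmetic is simple enough to set out as a toy calculation. The numbers below are purely illustrative (they are not Russell’s own), but the sketch captures his point: the two worlds produce identical output and identical total leisure; they differ only in how the leisure is distributed.

```python
# Toy model of Russell's pin-factory arithmetic (illustrative numbers only).
workers = 100                   # people originally making pins
hours_before = 8                # hours each works before the invention
rate_before = 1                 # pins per worker-hour before the invention
rate_after = 2 * rate_before    # the invention doubles productivity

# Demand is fixed: the world needs only as many pins as it already buys.
demand = workers * hours_before * rate_before   # 800 pins per day

# Sensible world: everyone halves their hours; demand is still met.
sensible_output = workers * 4 * rate_after      # 100 workers * 4 h * 2 pins/h

# Actual world: everyone still works 8 hours, so half are laid off.
employed = workers // 2
actual_output = employed * hours_before * rate_after
unemployed = workers - employed

print(sensible_output == demand)   # same pins, everyone on four hours
print(actual_output == demand)     # same pins, half overworked
print(unemployed)                  # the other half "totally idle"
```

Either way the world gets its 800 pins a day; the only variable is whether the freed-up hours arrive as shared leisure or as unemployment for half the workforce.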

His conclusion is that everyone could and would work far fewer hours, if only the system permitted us to:

“If the ordinary wage-earner worked four hours a day, there would be enough for everybody and no unemployment – assuming a certain very moderate amount of sensible organization. This idea shocks the well-to-do, because they are convinced that the poor would not know how to use so much leisure.”14

It was still only 1932, remember – technology’s “wonders” have moved on a lot since Russell’s day…


Apis mellifera, the honey-bearing bee, is the paragon of industriousness. It’s a pleasure just to watch them humming their way from flower to flower. Working all the hours the apian god sends, without a care in the world. We ascribe tremendous social virtue to our arthropodous familiars, the busy, busy bees. However, if we are to judge bees fairly then we ought properly to consider more critically what it is that our conscientious little friends actually get up to day in, day out…

For though we say that the bees are “at work” – the infertile females who carry out the majority of tasks technically denominated as “workers” – their most celebrated activity, the foraging for nectar from flowers, can hardly be considered a “real job” at all. Unless by “real job” we allow that gorging oneself on the sweetest food available automatically qualifies as work. For, after supping up an abdomenful of nectar (I exaggerate a little for effect), these “workers” then return home to empty the contents of their bellies, as any professional drinker might. Back at the hive, their sister bees also collaborate in the transformation of the incoming nectar, collectively “manufacturing” honey by means of repeated consumption, partial digestion and regurgitation – and apologies to anyone who has suddenly lost their appetite for honey, but bear in mind that milk and eggs are no less strange when you stop to think about them.

By chance, it happens that humans (and other creatures) are partial to the sticky end product of a bee’s binge drinking session. I personally love it. And so we steal away their almost intoxicating amber syrup and attach an attractive price tag to it. The bees receive compensation in the form of sugar, and being apparently unaware of our cheap deception, are extolled as paragons of virtue.

In fact, whenever we take to judging or appraising human conduct of any kind, there is a stubborn tendency to take direction either from Religion, or, if Religion is dismissed, to look for comparisons from Nature. If doing something “isn’t natural”, a lazy kind of reasoning goes, then evidently – evidentially, in fact – there must be something wrong with it. For it cannot be right and proper to sin against Religion or to transgress against Nature. Thus, behaviour that is unorthodox and deviant in relation to a received norm is denounced, in accordance with strict definition indeed, as perversion.

This fallacious “appeal to nature” argument also operates in reverse: whenever a particular behaviour is thought virtuous or worthwhile, then – generally without the slightest recourse to further identifiable evidence – ipso facto, it becomes “natural”. Of the tremendous variety of human activities, work is outstanding in this regard. Throughout historic times, societies have consistently upheld that work is self-evidently “natural”; the Protestant “work ethic” is perhaps the most familiar and unmistakeably religious variant of a broader sanctification of labour. It is surely worth noting, though, that God’s punishment for Adam’s original sin was that he should be expelled from Paradise “to till the ground from whence he was taken.”15 (Most probably booming “the world doesn’t owe you a living, my son!” before slamming the gates to paradise shut.) Protestant mill-owners, of course, found it convenient to overlook how hard labour was God’s original punishment.

But then, atheistic societies have been inclined to extol work more highly still, and not simply because it is “natural” (the commonest surrogate for Religion), but because atheism is inherently materialist, and since materials depend upon production, productivity is likewise deemed more virtuous and worthwhile. Thus, under systems both Capitalist and Communist, work reigns supreme.

Stalin awarded medals to his miners and his manufacturers – and why not? Medals for production make more sense than medals for destruction. Yet this adoration of work involves a doublethink, with Stalin, for example, on the one hand glorifying the hard labour of labour heroes like, most famously, Alexey Stakhanov, and meanwhile dispatching his worst enemies to the punishment of hard labour in distant work camps, as did Mao and as did Hitler. “Arbeit macht frei” is an horrific lie, yet in some important sense the Nazi leaders evidently believed their own lie, for aside from war and genocide, the Nazi ideology once again extolled work above all else. In the case of Communism, the exaltation of the means of production was to serve the collective ends; in Fascism, itself the twisted apotheosis of Nature, work being natural ensures it is inherently a still greater good.

Yet oddly, whenever you stop to think about it, very little modern humans do is remotely natural, whether or not it is decent, proper and righteous. Cooking food isn’t natural. Eating our meals out of crockery by means of metal cutlery isn’t remotely natural either. Sleeping in a bed isn’t natural. Wearing socks, or hats, or anything else for that matter, isn’t natural… just ask the naturists! And structuring our lives so that our activities coincide with a predetermined time schedule isn’t the least bit natural. Alarm clocks aren’t natural folks! Wake up!

But work is indeed widely regarded as an especially (one might say uniquely) exemplary activity, as well as a wholesomely natural one. Consider the bees, the ants, or whatever other creature fits the bill, and see how tremendously and ungrudgingly productive they all are. See how marvellously proactive and business-like – such marvellous efficiency and purpose! In reality, however, the bees, ants and all the other creatures are never working at all – not even “the workers”. Not in any meaningful sense that corresponds to our narrow concept of “working”. The bees, the ants and the rest of the critters are all simply being… being bees, being ants. Being and “playing”, if you prefer: “playing” certainly no less valid as a description than “working”, and arguably closer to reality once understood from any bee or ant’s perspective (presuming they have one).

No species besides our own (an especially odd species) willingly engages in drudgery and toil; the rest altogether more straightforwardly simply eat, sleep, hunt, drink, breathe, run, swim and fly. The birds don’t do it! The bees don’t do it either! (Let’s leave the educated fleas!) Nature natures and this is all. It is we who anthropomorphise such natural activities and by attaching inappropriate labels transform ordinary pleasures into such burdensome pursuits that they sap nature of vitality. So when Samuel Johnson says, “No man loves labour for itself!” he is actually reminding us all of our true nature.


Whether or not we welcome it, “manpower” (humanpower that is), like horsepower before it, is soon to be superseded by machine-power. Indeed, a big reason this profound change hasn’t made a greater impact already is that manpower (thanks to contemporary forms of wage slavery and the more distant indentured servitude of sweatshop labour) has remained comparatively cheap. For now the human worker is also more subtle and adaptable than any automated alternative. All of this, however, is about to be challenged, and the changeover will come with unfaltering haste.

To a considerable extent our switch to automation has already happened. On the domestic front, the transfer of labour is rather obvious, with the steady introduction and accumulation of so many labour-saving devices. For instance, the electric washing machine, which eliminates the need to use a washboard, to rinse by hand or to squeeze clothes through a mangle, spares us a full day of labour per week. Once these became automatic washer-dryers, the only remaining task was to load and unload the machine. In my own lifetime the spread of these at-first luxury appliances has become complete throughout the Western world. Meantime, the rise and rise of factory food and clothing production means ready meals and socks are so inexpensive that fewer of us actually bother to cook and scarcely anyone younger than me even remembers what darning is. The bored housewife was very much a late twentieth-century affliction – freed from cooking and cleaning there was suddenly ample time for stuffing mushrooms.

Outside our homes, however, the rise of the machine has had a more equivocal impact. Indeed, it has been counterproductive in many ways, with new technologies sometimes adding to the workload instead of subtracting from it. The rise of information technologies is an illustrative example: the fax machine, emails, the internet and even mobile phones have enabled businesses to extend working hours beyond our traditional and regular shifts, and in other ways, work has been multiplied as the same technologies unnecessarily interfere to the detriment of real productive capacity.

Today’s worker is faced with more assessments to complete, more paperwork (albeit usually in digital form), more evaluation, and an ever-expanding stack of office emails to handle – with demands for swift replies to circulars and a multitude of other paper-chasing obligations meaning we spend half our days stuck in front of a monitor or bent over the office photocopier. Every member of “the team” is now recruited to this singular task of administrative procedure.

But these mountains of paper (and/or terabytes of zeroes and ones) needing to be reprocessed into different forms of paper and/or digital records are only rising in response to the rise of the office. In fact, it is this increase in bureaucracy which provides the significant make-weight to mask the more general underlying decline in gainful (meaning productive) employment. Yet still, this growth in administration is a growth that only carries us so far, and a growth that can and ultimately will be eliminated, if not for perfectly sound reasons of practicability, then by automation. Ultimately, office workers are no more immune to this process of technological redundancy than the rest of us.

First broadcast by Channel 4 in 1993, the final episode of Tim Hunkin’s wonderful “Secret Life of the Office” served up a humorous take on the social engineering that led to the Twentieth Century’s rise of the office:


That the robots are coming is no longer science fiction, any more than the killer robots circling high over Pakistan and Yemen, armed with their terrifyingly accurate automated AGM-114 Hellfire missiles, are science fiction. In fact, all our future wars will be fought by means of killer robots, and, unless such super-weapons are banned outright or, at the very least, controlled by international treaties, subsequent generations of these ‘drones’ will become increasingly autonomous – the already stated objective is to produce fully autonomous drones; an horrific prospect. It is also a prospect that perhaps most graphically illustrates how sophisticated today’s robotic systems have become, even if, as with all cutting-edge technology, the military enjoys the most advanced systems. In short, the grim robot fleets are with us, and set to become swarms unless nations act to outlaw their deployment, whereas their more beneficial robotic descendants still wait placidly in the wings. The arrival of both fleets heralds a new age – one for the better and one decidedly for the worse.

Of course, the forthcoming workforce of robots might also be for the worse. Yet the choice is ultimately ours, even if we cannot hold off that choice indefinitely, or even for very much longer. For all our robotic rivals (once perfected) hold so many advantages over a human workforce. Never grumbling or complaining, never demanding a pay rise or a holiday, and, in contrast to human drones, never needing any sleep at all, let alone scheming against their bosses or dreaming up ways to escape.

And the new robots will not stick to manufacturing, or cleaning, or farming the land, or moving goods around in auto-piloted trucks (just as they already fly planes), but soon, by means of the internet, they will be supplying a host of entirely door-to-door services – indeed, a shift in modes of distribution is already beginning to happen. In the slightly longer term, robots will be able to provide all life’s rudimentary essentials – the bare necessities, as the song goes – quietly, efficiently and ungrudgingly constructing and servicing the essential infrastructure of a fully functioning civilisation. Then, in the longer term still, robots will be able to take care of the design, installation and upgrading of everything, including their own replacement robots. In no time, our drudgery (as well as the mundane jobs performed by those trapped inside Third World sweatshops) will have been completely superseded.

This however leads us to a serious snag and a grave danger. For under present conditions, widespread automation ensures mass redundancy and long-term ruin for nearly everyone. And though there are few historical precedents, surely we can read between the historical lines, to see how societies, yielding to the dictates of their ruling elites (in our times, the bureaucrats and technocrats working at the behest of unseen plutocrats), will likely deal with those superfluous populations. It is unwise to expect much leniency, especially in view of the current dismantlement of existing social safety nets and welfare systems. The real clampdown on the “useless eaters” is only just beginning.

It is advisable, therefore, to approach this arising situation with eyes wide open, recognising such inexorable labour-saving developments for what they are: not merely a looming threat but potentially, at least, an extraordinary and unprecedented opportunity. However, this demands a fresh ethos: one that truly values all human life for its own sake and not merely for its productive capacity. More specifically, it requires a steady shift towards reduced working hours and greatly extended holidays: a sharing out of the ever-diminishing workload and a redistribution of resources (our true wealth), which will of course remain ample in any case (the robots will make sure of that).

This introduction of a new social paradigm is now of paramount concern, because if we hesitate too long in making our transition to a low work economy, then hard-line social and political changes will instead be imposed from above. Moves to counter what will be perceived as a crisis of under-employment will mean the implementation of social change but only to benefit the ruling establishment, who for abundantly obvious reasons will welcome the rise in wealth and income disparity along with the further subjugation of the lower classes – the middle class very much included.

As physicist Stephen Hawking said in response to the questions “[D]o you foresee a world where people work less because so much work is automated?” and “Do you think people will always either find work or manufacture more work to be done?”:

“If machines produce everything we need, the outcome will depend on how things are distributed. Everyone can enjoy a life of luxurious leisure if the machine-produced wealth is shared, or most people can end up miserably poor if the machine-owners successfully lobby against wealth redistribution. So far, the trend seems to be toward the second option, with technology driving ever-increasing inequality.”16

It is an answer that closely echoes Wilde’s foresight of more than a century ago; the difference being one of emphasis. Hawking stresses the threat of what he calls the “second option”, whereas Wilde encourages us to press ahead in order to realise Hawking’s “life of luxurious leisure” for everyone.

Of course, there will always be a little useful work that needs doing. Robots will ultimately be able to perform all menial, most manual and the vast majority of mental tasks far more efficiently than any human brain and hand, but there will still be the need and the place for the human touch. In education, in medicine and nursing, in care for the elderly and sick, and in a host of other, sometimes mundane tasks and chores: emotionally intricate, kindly and compassionate roles that are indispensable to keeping all our lives ticking pleasantly along. The big question for our times, however, is really this: given the cheapness and abundance of modern labour-saving equipment, how is it that, even in the western world, working hours are continuing to rise instead of contracting? The question for tomorrow – one that the first question contains and conceals – is this: given complete freedom and unrestricted choice, what would we actually prefer to be doing in our daily lives? As Bertrand Russell wrote:

“The wise use of leisure, it must be conceded, is a product of civilization and education. A man who has worked long hours all his life will become bored if he becomes suddenly idle. But without a considerable amount of leisure a man is cut off from many of the best things. There is no longer any reason why the bulk of the population should suffer this deprivation; only a foolish asceticism, usually vicarious, makes us continue to insist on work in excessive quantities now that the need no longer exists…”

“Modern methods of production have given us the possibility of ease and security for all; we have chosen, instead, to have overwork for some and starvation for others. Hitherto we have continued to be as energetic as we were before there were machines; in this we have been foolish, but there is no reason to go on being foolish forever.”17


I was about twelve when I took my first flight. It was onboard a Douglas DC9 and I was travelling to Vienna on an exchange trip. I was so excited and not afraid at all – or at least not afraid of the flight. Indeed, I recall how this was the main question older relatives kept asking and I found their obsession puzzling more than anything. But as I have grown older I have sadly developed a fear of flying. This is annoying in the extreme. Why now… when I’m middle-aged and have so much less to lose? But fear is only seldom a purely rational impulse.

Not that it is half so irrational as we are told, to feel severe anxiety about being catapulted inside a thin metal capsule six miles up at close to the speed of sound. Statistics are one thing, but being in the presence of sheer physical danger is another. That said, fear of flying is surely as much about loss of control as anything. Why else did my own fear of flying worsen as I got older? Children are more accustomed than adults to feeling powerless, and so better able to relish the excitement of situations totally outside their control.

Whole societies – or at least majority sections of societies – also suffer with phobias. Like our private fears, these collective fears held by social groups are frequently rooted in some sense of an impending loss of control. Fear of foreigners, fear of financial collapse, and fear of “terror”. But seldom considered is another societal phobia: our collective ‘fear of flying’. Flying in the poetic sense, that is: of fully letting go of the mundane. Instead, it seems our common longing is to be grounded: an understandable desire.

Why else, scarcely a century since the Wright Brothers’ miraculous first flights, do today’s air passengers find flying (that ancient dream) so tiresome that our commercial airlines serve up non-stop distractions to divert attention away from the direct experience? Indeed, listening to those familiar onboard announcements bidding us a pleasant flight, we are inclined (and very likely reclined) to hear the incidental underlying message: “we are sorry to put you through the dreary inconvenience of this journey”.

We fly and yet we don’t fly – or not as those who first dreamt of flight imagined. Flight has instead been transformed from visionary accomplishment into a nuisance, taken entirely for granted by clock-watchers impatiently kicking their heels beneath the slow-turning departure boards.

And just why are today’s airports such sterile and soul-destroying anti-human spaces? Presumably because this is again what modern humans have come to expect! The same can be said for so many facets of modern life. If we can transform the miracle of flight into a chore, then it follows that we can turn just about any activity into one.

Next chapter…


In 1958 Mike Wallace interviewed psychoanalyst and social critic, Erich Fromm. What Fromm says about society, materialism, relationships, religion, and happiness is remarkably prescient, as is his analysis of a growing alienation as we become diminished to the role of products in an age of consumerism:


Addendum: the future of work and Universal Basic Income

Due to its historical roots in workers’ movements18, the political left has tended to hold a somewhat conflicted position when it comes to appraising the value of work. The understandable and perfectly legitimate elevation of the worker has had a countervailing effect in accentuating the virtuousness of work per se, thereby adding to the weight of received wisdom that to endure toil and hardship is somehow intrinsically valuable. This is why the left has fallen into the habit of making a virtue out of the central object of the oppression it faces.

So what is the goal of the political left (of socialism, if you prefer)? What is its aim, if not, so far as is possible, the full emancipation of the individual? However understandable it may be as a strategy, celebrating work for its own sake – whatever dignifies and ennobles labour – disguises the base truth that only seldom is it edifying, and more often just a millstone, frequently a terrible one, which, if we are ever to become truly “free at last”, ought to be joyfully laid aside.

In 2013 David Graeber, professor of anthropology at the LSE, wrote an excoriating essay on modern work for Strike! magazine. “On the Phenomenon of Bullshit Jobs” was read over a million times and translated into seventeen languages within weeks. Embedded below is a lecture Graeber gave to the RSA (Royal Society for the encouragement of Arts, Manufactures and Commerce), expanding on this phenomenon and exploring how the proliferation of meaningless jobs – more associated with the 20th-century Soviet Union than latter-day capitalism – has impacted modern society:

Since writing most of the above chapter the Zeitgeist has shifted remarkably. Suddenly technological unemployment is treated as a serious prospect and debated as part of a wider political discourse on future trends. Introduced into this new debate, especially on the left, is the proposal for a ‘universal basic income’, i.e., money provided to everyone by the state to cover basic living expenses. Importantly this payment would be provided irrespective of how many hours a person works and has no other (discernible) strings attached.

UBI is certainly a very bold initiative as well as a plausible solution to the diminishing need for human workers in the coming hi-tech era. Unsurprisingly, I very much welcome it, at least in principle, but wish also to offer a small note of caution. Before large numbers of us are able to live solely by means of a state-provided UBI, it will be essential to adjust societal norms relating to work. There can be no stigma in idleness. For if UBI is seen as merely a state handout and its recipients as welfare dependents, then we put them all into severe danger.

After all, work historically equates to status and money, and until this ingrained relationship is eroded away, anyone subsisting on UBI alone would rather quickly sink to the level of a second-class citizen. This is why I propose that the better approach to UBI is to advance by baby steps: reducing days and hours, increasing holidays, lowering the pensionable age, and expanding education – we should in fact think of eventually offering the luxury of lifelong education for all. Given where we start from today, to attempt the leap in one giant stride is surely too much of a risk. If UBI is truly our goal then we might best reach it by trimming work back until it barely exists at all.


Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.


1 Quotes taken from The Life of Samuel Johnson, LL.D by James Boswell (1791). In the original version, the section substituted by ellipsis reads as follows: “There is, indeed, this in trade:– it gives men an opportunity of improving their situation. If there were no trade, many who are poor would always remain poor.”

2 Now part of Imperial College (my own alma mater).

3 Extract taken from The soul of man under socialism by Oscar Wilde (first published 1891).

4 The Open Conspiracy was published in 1928, subtitled “Blue Prints for a World Revolution”. These extracts are taken from Chapter 1 entitled “The present crisis in human affairs”. Interestingly, in a letter to Wells, albeit a begging letter, Bertrand Russell said of the work: “… I do not know of anything with which I agree more entirely”. The Open Conspiracy was later revised and republished as “What Are We to Do with Our Lives?” in 1931.

5 Many boys and girls suffocated and others fell to their deaths. Nor were matters helped by the practice among master sweeps of lighting a fire beneath the children in order to force them to climb faster.

6 Quote taken from The Open Conspiracy.


7 “Two of the most perfect lives I have come across in my own experience are the lives of [the French Symbolist poet, Paul] Verlaine and of Prince Kropotkin: both of them men who have passed years in prison: the first, the one Christian poet since Dante; the other, a man with a soul of that beautiful white Christ which seems coming out of Russia.”

Taken from “De Profundis”, meaning literally “from the depths”; Wilde’s celebrated cri de coeur was intended, in part at least, as an extended letter and impassioned rebuke to his lover Lord Alfred Douglas. It was written during his imprisonment in Reading Gaol between January and March 1897, and has since been publicly released in various expurgated versions, the first of which was published in 1905. A complete version was finally released in 1962.


8 From The Open Conspiracy by H.G. Wells. The full set of seven “broad principles” reads as follows:

(1) The complete assertion, practical as well as theoretical, of the provisional nature of existing governments and of our acquiescence in them;

(2) The resolve to minimize by all available means the conflicts of these governments, their militant use of individuals and property, and their interferences with the establishment of a world economic system;

(3) The determination to replace private, local or national ownership of at least credit, transport, and staple production by a responsible world directorate serving the common ends of the race;

(4) The practical recognition of the necessity for world biological controls, for example, of population and disease;

(5) The support of a minimum standard of individual freedom and welfare in the world; and

(6) The supreme duty of subordinating the personal career to the creation of a world directorate capable of these tasks and to the general advancement of human knowledge, capacity, and power;

(7) The admission therewith that our immortality is conditional and lies in the race and not in our individual selves.

In light of what was about to come, this last item of the seven is perhaps the most perturbing. Wells introduces it as follows:

“And it is possible even of these, one, the seventh, may be, if not too restrictive, at least unnecessary. To the writer it seems unavoidable because it is so intimately associated with that continual dying out of tradition upon which our hopes for an unencumbered and expanding human future rest.”

9 Extract from The soul of man under socialism by Oscar Wilde (first published 1891).

10 From A Modern Utopia by H. G. Wells (published 1905). The same passage continues:

“But unprofitable occupation is also intended by idleness, and it may be considered whether that freedom also will be open to the Utopian. Conceivably it will, like privacy, locomotion, and almost all the freedoms of life, and on the same terms – if he possess the money to pay for it.”

11 Extract from The Open Conspiracy by H.G. Wells (first published 1928).

12 Extract from The Soul of Man Under Socialism by Oscar Wilde (first published 1891).

13 Ibid.

14 Extract taken from In Praise of Idleness by Bertrand Russell (1932). Note that Russell’s reference to pin manufacture is a deliberate allusion to Adam Smith’s famous hypothetical pin factory in which he illustrated the benefits of ‘division of labour’ in The Wealth of Nations.

15 From Genesis 3:23 (KJV).

16 In answer to a question posed during a Reddit Ask Me Anything session on October 8, 2015.

17 Extract taken from In Praise of Idleness by Bertrand Russell (1932).

18 Without an upwelling of righteous indignation amongst the oppressed rank and file of working people, no leftist movement would ever have arisen and gained traction. Yet the political left also owes its origins to the early co-operative movements, to a spontaneous awakening of Enlightenment humanists, to the Romantics, and most importantly, to fringe religious groups. Tony Benn famously said that the formation of the Labour Party in Britain owed “more to Methodism than Marx”.

In 1832 six agricultural labourers formed a friendly society to protest against their meagre wages. George Loveless, a Methodist local preacher, was the leader of this small union – the other members included his brother James (also a Methodist preacher), James Hammett, James Brine, Thomas Standfield (Methodist and co-founder of the union) and Thomas’s son John. These men were subsequently arrested, convicted and sentenced to transportation. Three years later, and following a huge public outcry which involved a march on London and petitions to parliament, they were issued pardons and allowed to return to England as heroes. This small band of men is now collectively remembered as the Tolpuddle Martyrs.

But the origins of socialism in Britain can really be traced as far back as the English Civil War, and indeed earlier again to Wat Tyler’s Peasants’ Revolt of 1381, when the workers of the Middle Ages, inspired by the teachings of the radical priest John Ball, took their demands directly to King Richard II, who reneged on his concessions and had them hunted down.


lessons in nonsense

The following article is Chapter Seven of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from the category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.


 ’Tis strange how like a very dunce,
Man, with his bumps upon his sconce,
Has lived so long, and yet no knowledge he
Has had, till lately, of Phrenology—
A science that by simple dint of
Head-combing he should find a hint of,
When scratching o’er those little pole-hills
The faculties throw up like mole hills.

Thomas Hood1


I am a teacher and so people often ask if I like teaching, and sometimes I say I do, but at other times I tell them I don’t. That’s work basically, except for an exceptional few who truly love, as opposed to merely tolerate, all aspects of the work they have to do. Having said that, teaching is a suitable occupation for me. It keeps me thinking about a favourite subject, and introduces me to some new and interesting people, albeit in rather formal circumstances.

Naturally enough I told myself that I’d never become a teacher – many teachers will say the same, at least when they’re being honest. But that’s work again, unless you’re one of the fortunate few. So what’s my beef? Well, just that really. Here I am being honest with you and yet I know that what I’m saying isn’t enough. Okay, let me expound more fully.

A few years ago I was offered redundancy and accepted. So I was back on the shelf. Needing another job, and to give myself any realistic chance of success, I’d have to recast myself somewhat. Imagine if I turned up at the interview and said more or less what I’ve just told you.

“Tell us why you want the job”, they’d ask, and my honest answer: “I need some money. I’m a decent teacher and I have a firm grasp of my subject. This could be one of the best offers I’ll get…” Well it just won’t do. No, as I say, I’d need to recast myself. Something more like this:

“I’m a highly experienced professional, looking for an exciting new challenge. I enjoy working as part of a team. I have excellent communication skills. I have excellent organisational skills. I have excellent people skills. I have excellent skills in personally organising communications. I have excellent skills in communicating to organised persons. I have excellent skills in organising communications personnel. Because of the outcomes-based nature of my teacher-training programme, I have developed a thorough understanding of the collection of evidence and portfolio-based approach to assessment. I’m very good at filing. I welcome the opportunity to work with students of different ages, cultures, ethnicities, genders and sexual orientations. I believe that I am ideally suited to the post of part-time classroom assistant and I want to have your babies…”

Well okay then, just try getting a job if you say otherwise.


I used to work in the public education sector. I ought perhaps to protect the name of the establishment itself, so let’s just say that for almost a decade I lectured A-level physics to a mix of students, with a range of abilities and nationalities, in a typical northern town… which covers the CV more or less.

As with every other college and university today, we were quite literally in the business of education; further education colleges having been “incorporated” by John Major’s government under the Further and Higher Education Act (FHEA) of 1992. Once at a meeting I was informed of my monetary value to the institution (which wasn’t much): the most important thing was that the college had to break even, although, as time went on, it rarely did.

Being in business also meant dealing with competition – primarily from other local schools and colleges. “Promotion”, then, which happens to be one of “the four Ps of marketing”2, involved pitching our unique selling points – in this case, a national BTEC diploma in forensic science which was ideal for attracting budding students away from the latest series of CSI: Crime Scene Investigation and daytime re-runs of Quincy.

Meanwhile, an impressive new body of staff dedicated to marketing and publicity had to be gradually assembled, and then another sizeable team assigned to deal with “student services”. It was the marketing department who coined our corporate mission statement: “Meeting learner needs and aspiring to excellence”, which as a dedicated workforce we committed to memory, to draw upon for inspiration during dreary afternoon classes in Key Skills Information Technology.

But no college is just a business, and in spite of appealing to a foreign market (a small number of students having been attracted from as far afield as China and Hong Kong), by far the biggest part of each year’s fresh cohort was home students, with funding provided out of the public purse. So the regulatory agency Ofsted, with its own teams of inspectors, would come now and then to tick their own assessment boxes. The “quality of learning provision” was not, apparently, guaranteed by market forces alone; our adopted business model only went so far – markets are generally supposed to ensure quality too, but not, it seems, in education.

So to ensure that our annual government targets were being reached, new management roles in “quality assurance” also opened up. The further paperwork, combined with already tight budgets made tighter by administrative growth, meant it was harder again to actually balance the books, or at least to reduce the losses. Eventually, a firm of management consultants was hired, and then another, each putting together reports that were either promptly forgotten or used to justify the multiplication of methods for cutting costs: these included laying off more teaching staff and generating yet more paperwork. A vicious circle justified on the basis of ‘quality’ and ‘efficiency’ had resulted in conditions for both staff and students that simply got worse and worse.

So it’s funny to remember a time, not very long ago, when colleges had operated with hardly any management or administrative staff at all. The odd secretary, a few heads of section, and a principal were quite sufficient to keep the wheels turning in most educational establishments. Whereas now, as the very model of a modern FE college, plagued by bureaucratic waste and inefficiency, and hampered at every turn by tiers of micro-management, we found there was insufficient funding for the real business of education. John Major’s incorporation of the FE sector had also led to year-on-year declines in real-terms wages for the teaching staff, who were increasingly made to feel like an unwanted overhead. “Struggling to survive and steadily achieving less” is not a mission statement, but it would at least have been more honest and to the point. Or, alternatively, I suppose we could have gone with: “do we look bothered?”


In one way, the problem here goes back all the way to Isaac Newton, and then to just a little before him. It was Newton, after all, who had decisively proved a truth that, whenever I pause to reflect on it, I still find rather startling: that the universe behaves according to elegant mathematical laws. Little surprise, then, that following the unprecedented success of Newton’s approach to establishing universal laws, which had so elegantly replaced the everyday disorder of earlier natural philosophies, those working in other fields would also try out the Newtonian approach of quantifying, theorising and testing: intent upon finding equivalent fundamental laws operating within their own specialisms. Scientists were to become the high-priests and priestesses of our post-Newtonian age, so what better model to follow?

But why does science work at all? Is it simply that by applying careful measurement and numerical analysis we make smarter decisions than we would by common sense alone, or is it that the universe really is in some sense mathematically accountable? That it works because God is inexplicably into algebra and geometry? The truth is that no-one knows.

But if the universe were not conducive to such logical and numerical analysis, then natural phenomena could be measured, data collected and collated, and yet all of this cataloguing would be to no avail. For outcomes can be forecast, within limits that can be precisely determined too, only because maths accurately accounts for the behaviours of atoms, and forces, and so forth. God may or may not play dice (and the jury is perhaps still out when it comes to the deeper philosophic truth of quantum mechanics) but when you stop and think about it, it’s strange enough that the universe plays any game consistently enough for us to discover the rules to it. “The most incomprehensible thing about the world,” said Einstein, “is that it is comprehensible.”

So what of the experts in the other widely varied disciplines? Disciplines rather more susceptible to the capriciousness of our human follies and foibles. Ones that are now called the ‘social sciences’, and following these to still lower rungs, the so-called theories of management and business. Taking their lead from Newton, experts in all these fields have turned to quantification, to the collection and collation of data, setting off with these data to formulate theories which are in some sense assumed universal – ‘theory’, in any case, being a word that takes a terrible bashing these days.

In science, the measure of any theory lies in two things: predictability and repeatability, because any scientific theory must allow some way for itself to be tested – and here I mean tested to destruction. If rocks didn’t fall to Earth with constant acceleration then Newton would be rejected. If the Earth didn’t bulge at the equator, if the tides didn’t rise and fall as they do, and if for other reasons Newton couldn’t account for the extraordinary multiplicity of natural phenomena, then Newton must step aside – as Newton finally has done (to an extent). But where is the predictability and repeatability in the theories of the social sciences, or in those taught in the business and management schools?

In the early eighteen hundreds, a German physician named Franz Joseph Gall noticed that the cerebral cortex (the so-called ‘grey matter’) of humans was significantly larger than that of other animals. Naturally enough, he drew the conclusion that it must be this exceptional anatomical feature that made humans intellectually, and thus morally, superior.

Gall also became convinced that the physical features of the cortex were directly reflected in the shape and size of the skull. He concluded that since the shape of the outside of the cranium is related to the shape of the inside, and thus to the general structure of the cerebral cortex, the bumps on someone’s head ought to be a potentially decipherable indicator of the way that person thinks, and therefore a sign of their innate character.

Gall’s ideas led to the discipline known as phrenology – the reading of the bumps on your bonce – which became a popular and rather serious area of study. Throughout the Victorian era, but especially during the first half of the nineteenth century, there were phrenological experts aplenty. Even after more careful researchers had proved Gall’s basic premise wrong, by showing that the external contours of the skull do not in fact closely match the shape of the brain, phrenology did not immediately lose all of its appeal; a few diehards continued to study it into the early years of the twentieth century.

In an important sense, we might be well advised to recognise that such people really were ‘experts’, just as informed about the detailed ins and outs of their subject as any expert must be, and, perhaps more importantly, able to speak its language. That phrenology is actually bunkum, and that its language is therefore pure and unadulterated gobbledegook, doesn’t in fact make them any lesser experts in their field. Indeed, it’s all too easy to forget that considerable training and painstaking effort are almost always necessary if one is to become a competent specialist in the fashionable nonsense of the day.


Richard Feynman, who was undoubtedly one of the greatest of modern physicists, got especially upset by what he saw as the increasing misappropriation of supposed scientific method in areas outside of scientific scope. He coined the useful term “cargo cult science”, drawing a parallel with the stories of Pacific Islanders who, after the Allies departed at the end of the war, mocked up the old airstrips and acted out the same rituals they had witnessed, with headphones and aerials made of bamboo or whatever, desperately hoping to bring the cargo planes back. Obviously, it didn’t work, any more than flapping your arms is enough to make you fly.

Feynman’s point was that the same goes for science and scientific method: merely doing and re-doing the things that scientists do is not enough to make you a real scientist. Testing something you’ve called ‘a hypothesis’ doesn’t automatically ensure that your results will be any more valid, whilst correlation is never sufficient proof of causation. But Feynman also makes a more important point: that as a scientist, you must always have in the back of your mind thoughts about the billion and one ways you might be wrong – science being founded upon uncertainty and rigorous empirical testing. Indeed, Feynman goes on to say that science requires a special kind of integrity, an honesty far beyond the honesty expected in everyday relationships, even when dealing with the most saintly of people. Scientific integrity requires not simply that one sticks to the truth as found, but that, in addition, one acknowledges every reasonable doubt against one’s own beliefs or theories in whatever ways they fail to account fully for that discovered truth. Nothing less than this will do:

“It’s a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty – a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid – not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you’ve eliminated by some other experiment, and how they worked – to make sure the other fellow can tell they have been eliminated.

“Details that could throw doubt on your interpretation must be given, if you know them. You must do the best you can – if you know anything at all wrong, or possibly wrong – to explain it. If you make a theory, for example, and advertise it, or put it out, then you must also put down all the facts that disagree with it, as well as those that agree with it. There is also a more subtle problem. When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.”3

Feynman properly gets to the heart of what it means to commit oneself to the call of science. For most professions may indeed be “conspiracies against the laity”, as George Bernard Shaw once famously wrote – most pointedly with regard to the profession of medical doctors4 – but a committed scientist (and Feynman is a wonderful example) has no interest in deception. Deception, and its partner in crime, delusion, are precisely what science is attempting to eliminate:

“I would like to add something that’s not essential to the science, but something I kind of believe, which is that you should not fool the layman when you’re talking as a scientist. I am not trying to tell you what to do about cheating on your wife, or fooling your girlfriend, or something like that, when you’re not trying to be a scientist, but just trying to be an ordinary human being. We’ll leave those problems up to you and your rabbi. I’m talking about a specific, extra type of integrity that is not lying, but bending over backwards to show how you are maybe wrong, that you ought to have when acting as a scientist. And this is our responsibility as scientists, certainly to other scientists, and I think to laymen.

“For example, I was a little surprised when I was talking to a friend who was going to go on the radio. He does work on cosmology and astronomy, and he wondered how he would explain what the applications of this work were. “Well,” I said, “there aren’t any.” He said, “Yes, but then we won’t get support for more research of this kind.” I think that’s kind of dishonest. If you’re representing yourself as a scientist, then you should explain to the layman what you’re doing – and if they don’t want to support you under those circumstances, then that’s their decision.”5


Feynman then goes on to compare the modern purveyors of the various kinds of pseudoscience with earlier witch doctors, although he might instead have said ‘high priests’. And it is important to understand that he is not necessarily saying that the witch doctor or the high priest is a deliberate charlatan, for it may well be that such proponents – the Victorian phrenologists providing again a helpful illustration – vehemently believe in their own quackery. The bigger point he makes is that such systems, or ‘theories’, lack the essential ingredient to make them authentically scientific.

Here, then, is Feynman making a personal assessment of how cargo cult science was already being used to mould society and to shape our lives as long ago as 1974:

“But then I began to think, what else is there that we believe? (And I thought then about the witch doctors, and how easy it would have been to check on them by noticing that nothing really worked.) So I found things that even more people believe, such as that we have some knowledge of how to educate. There are big schools of reading methods and mathematics methods, and so forth, but if you notice, you’ll see the reading scores keep going down – or hardly going up in spite of the fact that we continually use these same people to improve the methods. There’s a witch doctor remedy that doesn’t work. It ought to be looked into; how do they know that their method should work? Another example is how to treat criminals. We obviously have made no progress – lots of theory, but no progress – in decreasing the amount of crime by the method that we use to handle criminals.

“Yet these things are said to be scientific. We study them. And I think ordinary people with commonsense ideas are intimidated by this pseudoscience. A teacher who has some good idea of how to teach her children to read is forced by the school system to do it some other way – or is even fooled by the school system into thinking that her method is not necessarily a good one. Or a parent of bad boys, after disciplining them in one way or another, feels guilty for the rest of her life because she didn’t do “the right thing,” according to the experts. So we really ought to look into theories that don’t work, and science that isn’t science.”6

Before we apply any theories to education, then, or offer diagnoses in other social spheres, how can we be certain that, as Feynman puts it, “those things it fits are not just the things that gave you the idea for the theory”? And how do we know that it also makes “something else come out right, in addition”? For what does it actually mean to improve education – is this something that can be so very precisely determined? Some will argue that we can judge from “success rates”, but then every ‘measure of success’ will inevitably be predefined within an established paradigm: an orthodoxy that is itself unchallenged. Of course, science has the remarkable property of re-setting its own paradigms, as its own extraordinary history amply demonstrates, but do the models used in sociology, pedagogy, management practice and business also have this property?

More generally, when the experts in business and management theory have established the rules, what proof do they have that these are not merely rules to games of their own making? How indeed do they show that their preferred management system is better, or optimal? Well, most often it will mean simply looking at profits. The bottom line. Money is the safest and surest instrument when you demand a purely numerical answer in evaluations of improvements within a society; it appears to be the ‘hardest’ measure available. But the question then should be: what does it actually measure? I will save my own thoughts on that for a later chapter.


What is education? Here’s my first stab: education is a method of communicating skills or ideas to another person. Or here’s a dictionary definition: “systematic instruction”; “development of character or mental powers”. Yes, a system for helping minds to develop – that sounds about right. So what’s required to successfully educate our population? Well, I’d suggest that it boils down to more or less two essential ingredients: i) interested students and ii) teachers who are both able and willing to teach. To help this process work a little better, we ought obviously to try to increase the likelihood of successful transmission of key information and skills, so it will certainly be helpful if the ratio of interested students to dedicated teachers is kept on the low side (I’d say from experience that about 10:1 is a good number). Do we need to constantly assess the quality of this learning provision? Well, isn’t that the purpose of final exams, which seem to be an unfortunate but necessary evil in any formal system of education?

But now I would like to go further again. For any approach to education that puts so much emphasis on ensuring ‘quality’ misses the point. Learning is a very different process to the manufacturing of parts on a production line. So if we apply the assembly line model (and to a great extent we do precisely this), at best our students will be turned out like precisely engineered cogs and, at worst, they may be turned into spanners. There just has to be a better approach – a frankly more laissez-aller approach.

Let’s go back to our own beginnings, and try to remember how wonderful it was when we felt the awakening of such fabulous new powers as walking and talking. Everything in our lives follows from those original awakenings, of finding first our feet and then our voices, and all the most valuable lessons in our lives have in some way or another continued that process of awakenings. Yet these two universal feats – achieved by literally everyone on earth who isn’t suffering from a serious physical or mental handicap – are almost impossibly complex and subtle achievements. Just think how difficult it is to learn a second language, and yet, you learned the basics of your native tongue with almost no direct training. So we are all born with the greatest capacity for learning; and we might better think of children as little learning machines (except not machines, of course, that’s the point). Rather, children learn in much the same way that caterpillars chew leaves: they just can’t help nourishing themselves with juicy knowledge.

Not that I’m claiming education is necessarily easy. It isn’t. There are usually growing pains too. It is unpleasant to discover that your ideas are incorrect, and yet correcting established prejudices and erroneous presumptions is at the heart of all true learning. Indeed, learning is probably a difficult and tedious thing more often than it’s a pleasure – and especially so as we get older and the things we need to first unlearn have become so deeply engrained that it seems a trauma to erase them. But learning, like most activities, should be enjoyable wherever and whenever this is possible. Why would anyone wish to make it otherwise?

In my own experience as a teacher, what matters most, assuming that the student is keen and relatively able, is persistence and encouragement – and certainly not tests and assessment. And whilst obviously people need to be able to read and write and add up and do all the other basic stuff necessary to function in society, just as we all need air and water for our bodies, education, if it is to be most nutritional, must also develop our higher faculties. It should expand a student’s scope not merely for interpreting the world about them and expressing whatever thoughts they have about it, but for heightening responsiveness. Because, and increasingly this is forgotten, education is so much more than training, as important as training can be – society needs its plumbers, but it needs its poets too.

That education is the cornerstone to a functioning democracy is a commonplace, yet just behind the platitude lies a richer vision of what we might mean by education. For democracy in the truest sense depends upon an enlightened version of education, which provides not only a safeguard against the social curses of ignorance, but which promotes knowledge and understanding because these are prerequisites for individual freedom, and by extension, for ensuring political freedom more generally. Happily, a more enlightened education of this kind is also a lifelong blessing for all who receive it. Better still, if real education makes the world more interesting and enjoyable, as it should, then this in turn makes for a more interesting and enjoyable world. Let’s take things from there.

American psychologist James Hillman was first Director of the Jung Institute in Zurich and is acknowledged as the founder of a movement known as archetypal psychology. He speaks to the central importance of imagination in all teaching and learning, and explains how we might surmount some of the barriers that inevitably arise when education is formalised.


I nearly forgot to mention what happened a few years ago. We had a change of principal at the college. The old guy who was loathed and feared suddenly retired and was replaced by a bright young turk. One day, our new principal arranged a meeting and told us all about the exciting future that lay ahead. Gone were the days of tedious education, soon we would welcome in the brave new world of ‘edutainment’ and ‘leisurecation’:

“I once saw a guy teaching physics by lying on a bed of nails”, he told us enthusiastically, by way of an example… hand on heart, I’m not making any of this up!

Thankfully we never introduced either ‘edutainment’ or ‘leisurecation’, for if indeed these terms can be translated into anything at all meaningful, then it is simply this: use any tricks at all to distract the students from the necessary exertions of learning. Even if that means bringing a bed of nails into the classroom. After all, they’re the customers.

Well, our new principal had the ear of the then-Secretary of State for Education, or so he informed us, and she was sold on his grand designs. The old buildings, he said, were riddled with concrete cancer and asbestos, but in a couple of years we’d be relocated to a brownfield site on the other side of town, becoming “the world’s first multiversity” – £100 million rings a vague bell. And yes, he said that too, “multiversity”. He was never short of portmanteau neologisms.

We did relocate and it did cost a small fortune, more than enough to break the bank. Soon after, our bright young principal relocated himself, jumping ship in the nick of time, having been handsomely rewarded (in spite of his failures) with promotion to the post of Vice-Chancellor at one of the new universities. Meantime, others who had attended his meeting were left to foot the bill, accepting another pay-freeze, and then cajoled into teaching longer hours to larger classes for improved “efficiency”, which meant, as a direct consequence, struggling with more paperwork than ever. All this was, of course, again to the detriment of both staff and students.

But soon there came a more certain nail in our coffin, when one of the mandatory Ofsted inspections reached the conclusion that the college was failing. Their reason? Although teaching and learning had been passed as satisfactory (and please note that this was before Ofsted downgraded their ‘satisfactory’ grade to mean unsatisfactory! 7), the college failed instead on grounds of poor leadership and management. Although we’d hardly needed Ofsted to tell us that…

So the management took the hit, right? Well, not exactly. The disappointing Ofsted results now allowed an already bloated and overbearing system to be expanded. As new intermediate tiers of management were hastily installed, those teaching were soon faced with extra hoops and hurdles. More management, not less, was the only way to redress the failures of leadership – turkeys being disinclined to vote for Christmas – and inevitably this meant a commensurate growth in paper-chasing checks on quality assurance and target attainment. For an already overstressed and deeply demoralised teaching staff it was simply too much, and that’s why so many of us grabbed the offer of redundancy cheques and headed for the exits (staff redundancies being another part of this new drive for ‘increased efficiency’). If I have any personal regret, it is only that I didn’t escape sooner.

Next chapter…


Addendum: could do better…

Earlier I posed the rhetorical question: “what does it actually mean to improve education?” Because, when considered in general terms, this is a question that is next to impossible to answer. However, anyone teaching a specialist subject will have a good idea of whether standards in schools and colleges have been rising or falling in their own discipline.

Over a period of more than a decade, I can personally testify to a steady decline in standards in my own subjects (physics and maths). In parallel with reductions in technical difficulty, there has been a commensurate lowering in the standard required for each grade. These changes were well underway long before I first called a register.

Indeed, our long leap backwards undoubtedly began when O-levels were replaced by GCSEs. A steadier atrophy has continued ever since, the downward impetus given an occasional helping hand, as with, for instance, the introduction of AS-grades. In physics, the current AS is now around O-level standard (in fact probably lower than that), which means that, unless we now teach A-level twice as efficiently (and we don’t), the standard of the full A-level has drastically fallen.

If you think I’m being unfair and nostalgic, then I recommend that you do a little research of your own. Pick up any GCSE textbook and compare it to a textbook from twenty-odd years ago. The differences are immediately obvious – again, in my own subjects – and these aren’t merely differences in style (something that is likely to shift over time) but in content too – both in breadth and in depth. If you still remain unconvinced after perusing a textbook or two, then I’d further advise that you take a look at an old-style exam paper. Is the difficulty of an exam paper today really equivalent to a paper from twenty years ago? The quick answer is no.

Here is an interactive maths quiz which allows comparison between GCSE and O-level style questions. It was published in The Telegraph back in August 2013.

And it is not simply that the level is lower but that later papers are much more structured than older ones. Questions that once existed as whole puzzles waiting to be unravelled are today parcelled neatly into bite-sized pieces, and always with each of the parts correctly sequenced:

“GCSEs and A-levels in science and geography are easier than they were 10 years ago, the exams regulator has said. Standards have slipped, with teenagers often facing more multiple choice and short structured questions and papers with less scientific content, according to reports published by Ofqual. The watchdog conducted reviews of GCSEs and A-levels in biology and chemistry between 2003 and 2008 as well as A-level geography between 2001 and 2010 and A-level critical thinking in 2010. The findings show that among the GCSEs, changes to the way the exams were structured had ‘reduced the demand’ of the qualifications, while the A-level reviews found that changes to the way papers were assessed had in many cases made them easier.”8

This was the verdict of the government’s own watchdog Ofqual, making an assessment in 2012 of the relative standard of GCSEs and A-levels compared to those taken just one decade before. In short, there is little point in denying that standards have fallen. As for the biggest official giveaway – well, that was surely the introduction of the GCSE A* grade. Much as the lead guitarist of spoof rock band Spinal Tap had the knob on his amplifier recalibrated to go up to level eleven!9


A few years ago (you’ll discover precisely when as you read on) I happened to be working at a university laboratory when I came across the following joke. It’s a good one:

Teaching Maths In 1970

A logger sells a lorry load of timber for £1000.

His cost of production is 4/5 of the selling price.

What is his profit?

Teaching Maths In 1980

A logger sells a lorry load of timber for £1000.

His cost of production is 4/5 of the selling price, or £800.

What is his profit?

Teaching Maths In 1990

A logger sells a lorry load of timber for £1000.

His cost of production is £800.

Did he make a profit?

Teaching Maths in 2000

A logger sells a lorry load of timber for £1000.

His cost of production is £800 and his profit is £200.

Your assignment: Underline the number 200.

Teaching Maths in 2009

A logger cuts down a beautiful forest because he is totally selfish and inconsiderate and cares nothing for the habitat of animals or the preservation of our woodlands.

He does this so that he can make a profit of £200. What do you think of this way of making a living?

Topic for class participation after answering the question: How did the birds and squirrels feel as the logger cut down their homes? (There are no wrong answers. If you are upset about the plight of the animals in question, counselling will be available).

Multiple copies had been printed out on A4 and left on one of the lab benches (perhaps accidentally on purpose – who knows?). But evidently someone at the university was having a good old laugh at the state of the nation’s education system.


Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.


1 Thomas Hood, Craniology, reported in Hoyt’s New Cyclopedia Of Practical Quotations (1922), p. 597.

2 The so-called 4 Ps of marketing were Product, Price, Promotion and Place. This is the so-called “producer orientated model” but after decades of research, it was revised to become the so-called 4 Cs of the “consumer-orientated model”, with the original Ps replaced respectively by Consumer, Cost, Communication, Convenience. In truth, of course, ‘Promotion’ has very little to do with actual ‘Communication’ and much more to do with Edward Bernays’ long-since abandoned P: ‘Propaganda’.

3 Extract taken from Cargo Cult Science by Richard Feynman. Adapted from the Caltech commencement address given in 1974.

4 “[But] the effect of this state of things is to make the medical profession a conspiracy to hide its own shortcomings. No doubt the same may be said of all professions. They are all conspiracies against the laity; and I do not suggest that the medical conspiracy is either better or worse than the military conspiracy, the legal conspiracy, the sacerdotal conspiracy, the pedagogic conspiracy, the royal and aristocratic conspiracy, the literary and artistic conspiracy, and the innumerable industrial, commercial, and financial conspiracies, from the trade unions to the great exchanges, which make up the huge conflict which we call society.” Taken from The Doctor’s Dilemma by George Bernard Shaw published by Penguin, 1946.

5 Extract taken from Cargo Cult Science by Richard Feynman. Adapted from the Caltech commencement address given in 1974.

6 Ibid.

7 “Education watchdog Ofsted wants to toughen the language of inspections in England – changing the “satisfactory” rating to “requires improvement”.

“Ofsted’s chief inspector, Sir Michael Wilshaw, wants to send a message that “satisfactory” is now unsatisfactory and that more schools should be pushing for the higher rating of “good”.”

From a BBC news article entitled “Ofsted plans to scrap ‘satisfactory’ label for schools”, written by Sean Coughlan, published January 17, 2012.

8 From an article entitled “Science exams easier, says Ofqual” published by The Independent on May 1, 2012.

9 For those unfamiliar with the mockumentary This is Spinal Tap: the eponymous British rock group are on tour in America to promote their latest album. At one point the band’s lead guitarist Nigel Tufnel (played by Christopher Guest) is showing the fictional maker of the documentary, Marty Di Bergi (played by Rob Reiner), his collection of instruments. When Tufnel shows Di Bergi one of his amplifiers that has a knob which goes up to eleven, Di Bergi asks him, “Why don’t you just make ten louder and make ten be the top number and make that a little louder?” Tufnel’s baffled reply is: “These go to eleven.”

Incidentally, anyone who has ever used BBC iPlayer will be familiar with a digital homage to Tufnel’s celebrated amp. In truth, it’s one of those jokes that wears thin so quickly, you almost immediately forget it was ever a joke to begin with.


the unreal thing

The following article is Chapter Eight of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.


“Advertising is the rattling stick inside a swill bucket”

— George Orwell


“Take a card, any card, it’s your choice… but don’t let me see what it is.” The magician fans the cards flamboyantly. We know it’s a trick of course. “Three of Clubs,” he tells us. We shake our heads dismissively – after all, we’re part of the act. The magician seems momentarily perplexed. “Do you have anything in your jacket pocket?” he asks as if desperately trying to turn our attention away from his apparent failure. We feel inside and find a sealed envelope. It’s the one we’d signed earlier in the performance. “Is the seal broken?” he asks, knowingly. “Open it – what’s inside?” We scratch our heads and quietly applaud. Somehow the magician has diverted our attention just long enough to construct the illusion of an altered reality. In truth his method was to “force” the card, and so his illusion relied on the simple fact that we really hadn’t a free choice at any stage. But we applaud because we admire his harmless deception. It amuses us to be deceived once in a while.


I saw an advert the other day. It read “Say No to No”, which is the kind of quasi-Zen mumbo-jumbo that advertising executives get paid a small fortune to write. What was the effect of that advertisement? Well, it had suddenly interrupted my original train of thought. I’d probably been looking for the cigarette lighter or wondering how the living room table was so heaped up with junk again, but now I was reading on about how negativity gets in the way of progress. And which company, I kept wondering as I read down, would attach itself to such a manifestly new age positive-thinking banner? I read on and came to examples of human achievements that, left to the nay-sayers, could never have happened:

“Yes, continents have been found…”, it read.

Found? By Columbus in 1492, presumably, and then Australia by James Cook. And no human had set eyes on them before? Obviously this is a rhetorical question. I read on…

“Yes, men have played golf on the moon…”

American men to be more precise. And it was indeed an incredible and truly awesome achievement – not the golf, but the travelling to the moon. When it comes to golf, there are obviously far superior facilities a lot closer to home. I read on…

“Yes, straw is being turned into biofuel to power cars…”

Well, hardly in the same league as exploration of such distant lands, but finally some inkling of where they were leading me…

I studied the picture more carefully. The words “Say no to no” are in thick capitals near the top of a blackboard already filled with images of progress and science – molecular structures, conic sections, a diagram showing a spherical co-ordinate system, graphs, line drawings of electron orbits and DNA, of animals and a ship and, of course, the ubiquitous pie-chart. A girl, her long straw-blond hair tied back into a pony-tail, and wearing a bright red tank top, has her back turned toward us. She is reaching high, almost on tip-toe, into the black and white and adding the upward flourish of a spiral. Perhaps I was looking at one of those recruitment adverts for teaching, yet something told me otherwise…

And there it was – I’d found it at last – deliberately placed outside the main frame of the picture; a small, emblematic containment for all that progress: a remote, red and yellow scallop shell. The message was far from loud, but that was the point. And once spotted it was very clear, yet it had been intentionally delivered at a subliminal level – out of picture, unobtrusive, easily missed. Its instruction surreptitious and beyond the margins. Why? Because they wanted me to attach the ideas of positivity and progress to the symbol of a multinational oil corporation just as surely as Pavlov’s dogs associated lunch with the ringing of their owner’s bell. They wanted me to feel good things the next time I saw the scallop and to never even think about why.1


Advertising is simply another act of illusion and as with the performing stage magician, the audience is well aware that they are being tricked. But in advertising the illusion runs deeper, so that aside from the obvious aim of persuading us to buy Coke instead of Pepsi or whatever, it very often constructs a host of other frauds. Take again the advert mentioned above as an example, with the girl reaching up on tip-toe. Here nothing is accidental, with all parts and relationships operating together to reinforce our idea of progress as a constant striving toward a better world, whilst in the background, it only quietly dismisses any “nay-sayers” who disagree. Like many predators, advertisers work by stealth, often, as here, offering glimpses of Utopia, or of wonderful and perpetual advancement, to draw us on and in. The carrot on a stick swinging endlessly before the eyes of the befuddled donkey.

But then, on other occasions, they will take a different tack, and get out a proper stick. They’ll make us uneasy about our looks, or our lack of social status, before offering a quick fix for these problems so frequently of their own devising. There are many ways to ring our bells: both carrots and sticks are equally effective.

And then everyone says this: “Adverts don’t work on me.” So these companies spend literally billions of pounds and dollars on refining their illusions, posting them up all across our cities and towns, filling our airwaves with their jingles and sound-bites, not to mention the ever-widening device of corporate sponsorship, and yet still this remains our self-deluding armour against such unending and ever more sophisticated assaults. I’ll bet you could find more people who’d say David Copperfield can really fly than would actually admit to being significantly influenced by advertising.


There probably never was a time when advertising was just that: a way to make products and services more widely or publicly known about. In such a time, adverts would have just showed pictures of the product and a simple description of its uses and/or advantages. “This is the night mail crossing the border…” – that sort of thing.

Though, of course, I have immediately picked a bad example, because the famous post office film is not only reminding us of what a jolly useful and efficient service our mail delivery is, but how wonderfully hard the GPO work whilst the rest of us are asleep. So on this different level Auden’s famous homage is a feel-good thing, encouraging us to connect our good feelings to the postal service; it is an early example of public relations, although still harmless enough in its quiet way.

But audiences get wise, or so we like to imagine, and so today’s advertisers have had to up the ante too. Gone are the days of telling you how to have “whiter whites” or advising everyone (with only a hint of surrealism) to “go to work on an egg”. Nowadays you’re far more likely to choose to eat a certain chewy stick because “it’s a bit of an animal” (without even noticing the entirely subliminal reference to your feelings about being carnivorous) or drink a can of soft drink because “image is nothing” (which presumes a ridiculous double-think on the part of the targeted purchaser). And where once a famous Irish beverage was just “good for you”, now it’s better because it comes “to those who wait”. Here you’re asked to make an investment in the form of time; an investment that is intended to add personal value to the brand.

Adverts are loaded with these and other sorts of psychological devices – cunningly latent messages or else entertaining ways of forging brand loyalty. They prey on the fact that we are emotional beings. They use tricks to bypass our rational centres, intending to hard-wire the image of their products to our feelings of well-being, happiness, contentment, success, or more simply, the image we have of ourselves. They use special words. LOVE for instance. Just see how many adverts say “you’ll love it”, “kids love it”, “dogs love it”, “we love it”, and so on and so on…. one I saw recently for condoms said simply “love sex” – talk about a double whammy!

Advertisers also like to scare us. When they are not showing us washing lines drying over the Fields of Elysium, or happy pals sharing time with packets of corn snacks, or elegant cars effortlessly gliding down open highways; they are constructing worlds of sinister dangers. Germs on every surface, and even in “those hard to reach places”. Threats from every direction, from falling trees to falling interest rates. I once saw a TV advert that showed a man desperately running from a massive and menacing fracture. It was a crack that seemed to be ripping through the very fabric of space and time, an existential terror relentlessly chasing after him through some post-apocalyptic nightmare. After a minute or so the threat abated and a solution was offered. Get your windscreen checked, it calmly advised.

And the government get in on this too. Watch out, watch out, there’s a thief about! Just say no to drugs! Sex is fun, but take precautions and don’t die of ignorance! In these ways, they ramp up fears of the real dangers we face, whilst also inculcating a sense of trust in the powers that be. The world is a perilous and unjust place, they say (which is true); fortunately, we are here to help you. Trust us to guide you. Obey our instructions. To protect you and your loved ones. To help you to realise your dreams. Together, we will make the world a fairer place. The constant PR refrains: “Believe”, “Belong”, “Trust”, and more recently, “Hope and Change”. O, ring out those bells!


Right now, there’s something refreshingly honest about smoking. Those of us who refuse or are unable to quit are left under absolutely no illusions about our little cancer sticks. We know perfectly well that each drag is bringing the grave that little bit closer. And it’s certainly not cool to smoke. Our clothes stink, our breath stinks, and stinking, we huddle outdoors, rain or shine, cluttering up the office doorways with our toxic fumes and heaps of fag-ends. But it wasn’t always so. Smoking had its golden age. A time when cigarettes were an accoutrement to style and when sharing a fag with a dame was nearly as great as sex.2 During this period, the tobacco industry invested a small fortune in maintaining their myth. They paid to lobby politicians, they made funds available for favourable medical research, and perhaps most significantly of all, they hired the best PR man in the business.

It can be fun to speculate on who were the most influential figures in history. Who would we wish to include? Great statesmen, formidable warriors, innovators, engineers, scientists and artists: when lists are polled for, the public generally take their pick from these, chucking in the odd saint or celebrity just for good measure. They choose between Churchill, Washington, Alexander the Great, Thomas Edison, and Albert Einstein, and if the criteria are widened to include villains as well as heroes, plump for Adolf Hitler, Mao Tse-tung, and Joseph Stalin. A selection, if you like, of the stars of the show. But what about people whose work keeps them behind the scenes? What of those whose greater skill was to remain invisible or simply unnoticed? Edward Bernays was just such a man.


To say that Bernays was a great PR man is to do him a considerable disservice, for Bernays, who also happened to be a nephew of no lesser light than Sigmund Freud, is nowadays regarded as the father of modern PR. He wrote the book. Rather candidly he entitled it simply Propaganda – the word, deriving from the Latin for “propagation”, was less sullied back in 1928. In the opening chapter Bernays lays out the situation as he sees it:

“The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.”

But Bernays is not warning us here, far from it. This is merely the way the world works, spinning along in a fashion that Bernays regards as both inevitable and to a great extent desirable. Better an orderly world of unseen manipulation than a world of ungovernable chaos. And it’s this point which he makes perfectly explicit in the very next paragraph:

“We are governed, our minds are molded, our tastes formed, our ideas suggested, largely by men we have never heard of. This is a logical result of the way in which our democratic society is organized. Vast numbers of human beings must cooperate in this manner if they are to live together as a smoothly functioning society.”3

We should perhaps not be surprised to learn then that Bernays’ book was one that didn’t make it onto the bonfires of the Third Reich. Instead, it is reported to have found a place in Joseph Goebbels’ personal library, helping to shape his own Nazi propaganda machine. Certainly, it is a very practical guide. It delves into a great many areas and asks important questions. One of the most significant questions it asks goes as follows:

“If we understand the mechanism and motives of the group mind, is it not possible to control and regiment the masses according to our will without their knowing it?”4

And the answer, as Bernays went on to prove with his amazing success in promoting everything from bacon and eggs to soap powder and political candidates, was HELL YES!

Working for the American Tobacco Company, Bernays even piggy-backed a ride on the women’s rights movement, offering encouragement to the fairer sex, for whom smoking in public was still very much a taboo, to keep on lighting their “Torches of Freedom.” Not that any similar strategy could work today obviously… well, not unless those torches were organically-grown by fair-trade tobacco farmers and rolled in chlorine-free paper supplied by sustainable forests, or whatever.

Bernays was the great promoter, perhaps the greatest, and he was keen to promote his own product, modern advertising, or as he called it propaganda, above all else. For Bernays, just as for his acolyte Joseph Goebbels, the future was propaganda:

“Propaganda will never die out. Intelligent men must realize that propaganda is the modern instrument by which they can fight for productive ends and help to bring order out of chaos.”5


Following Bernays, advertising no longer stops at breakfast cereals, toothpaste and petrochemical companies, having extended its parasitic tendrils throughout all areas of life, so that image becomes everything. Newspapers and magazines are glossier than ever. They radiate forth into the empty void of secular consumerist existence, visions of earthly fulfilment that can be bought (at preferential interest rates) – holidays, home improvements, house moves (especially abroad), fast cars, and millionaire lifestyles.

They tell us what is right to think about: beauty, health, fashion and that oh-so elusive of attributes, style. They tell us “how to get on”. They tell us what’s worth worrying about. DO worry about your wrinkles. DO worry about your waistline. DO worry about your split-ends. DO WORRY – because you’re worth it! Just as importantly we get to learn what is worth thinking about: success, fame and glamour, which when multiplied together make celebrity. Celebrity: from the Latin celebrare meaning to celebrate, or to honour. So whereas the ancients believed that the fixed and eternal heavenly stars were gods, we instead are sold a parallel myth revolving around “the stars of today”.

But newspapers and magazines are nothing, for their influence pales into insignificance when set in comparison to that flickering blue screen in the corner of the living room. It is our gateway to another world, a parallel dimension, where we are welcomed back each day by our virtual friends. It is a fire to warm us. A shadow-play of mesmerising potency. And here, the ever-tantalising jam of tomorrow has finally slopped over from its earlier containment within commercial breaks, to become what is now a mainstay for entire broadcasting schedules. Carrots and sticks for us to nod along to, 24/7, and three hundred and sixty-five days of the year.

It’s not even that all television is bad. Some is excellent. I would cite as an exemplar the consistently superior content of BBC wildlife documentaries, which far exceed any comparable alternative whether offered by books, radio, or at the cinema. Here is television at the very pinnacle of its achievement.

A great deal on television is produced just to amuse us, or amaze us, and occasionally, actually to inform us, and much of this merits credit too, but I do not feel it necessary to waste time pushing an open door. We all know that television can sometimes be marvellous. But we also know that most of it is junk. Junk that, with the influx of multiple digital channels, is spread ever more thinly and widely. In a modern world television certainly has its place, but we will do well never to forget its unprecedented powers:

“Right now there is an entire generation that never knew anything that didn’t come out of this tube. This tube is the gospel, the ultimate revelation. This tube can make or break presidents, popes, prime ministers… This tube is the most awesome God-damn force in the whole godless world, and woe is us if it ever falls into the hands of the wrong people…

And when the twelfth largest company in the world controls the most awesome God-damned propaganda force in the whole godless world, who knows what shit will be peddled for truth on this network. So you listen to me – Listen to me – Television is not the truth. Television’s a god-damned amusement park…

We’re in the boredom killing business… But you people sit there day after day, night after night, all ages, colours, creeds – We’re all you know – You’re beginning to believe the illusions we’re spinning here. You’re beginning to think that the tube is reality and that your own lives are unreal. You’ll do whatever the tube tells you. You’ll dress like the tube, you’ll eat like the tube, you’ll raise your children like the tube. You even think like the tube. This is mass madness. You maniacs! In God’s name, you people are the real thing – we are the illusion.”

Of course, if you’ve seen the film Network, from which this extraordinary rant is taken, then you’ll also be aware that these are the words of a madman!6

At the top of the chapter I quoted Orwell’s no-nonsense assessment of advertising, and advertising is indeed as he describes it: the rattling stick eliciting the same Pavlovian response in the pigs as advertising executives wish to implant in our human minds. Their main intent is to push their clients’ products by making us salivate with desire. This was no different in Orwell’s time. Whilst advertising’s still more ugly parent, propaganda, has always aimed to change minds more fundamentally. It treats ideas as products and sells them to us. But the techniques in both advertising and propaganda have come a long way since Orwell’s time.

This power to propagandise has grown in large part because of television, the blue screen softly flickering away in the corner of every living room having opened up the possibility for thousands of ‘messages’ each day to be implanted and reinforced over and over. Unconsciously absorbed instructions to think in preformed patterns are precisely what Aldous Huxley thought would be needed if ever the seething and disorderly masses of any ordinary human population might be replaced by the zombie castes of his futuristic vision Brave New World. “Sixty-two thousand four hundred repetitions make one truth”, he wrote.7

Which is a joke, but like so much in Huxley’s work, a joke with very serious intent. Huxley’s vision of a future dystopia is subtler in some ways than Orwell’s own masterpiece Nineteen Eighty-Four, not least because its mechanisms of mind control are wholly insidious: Huxley shows how you don’t have to beat people into submission in order to make them submit. Yet even Huxley never envisaged a propaganda system as pervasive and powerful as television has eventually turned out to be.


Advertising involves “the art of deception” and it has never been more artful than it is today… sly, crafty, cunning, scheming, devious, sneaky, and totally calculating. However, it is increasingly artful in that other sense too: being achieved with ever greater creative skill. Indeed, the top commercials now cost more than many feature films, and, aside from paying small fortunes for celebrity endorsement, the makers of our grandest and most epic commercials take extraordinary pains to get the details right.

Engineered to push the buttons of a meticulously studied segment of the population, niche marketing techniques ensure precise targeting with optimum impact. Every image, sound and edit honed, because time is money when you’re condensing your ‘message’ into thirty seconds. It is perhaps not surprising therefore that these commercial ‘haikus’ are regarded by some as the works of art of our own times. A view Andy Warhol (himself a former ‘commercial artist’) espoused and helped promote – though mostly he made his fortune espousing and promoting his own brand: a brand called Andy Warhol.

Warhol wrote that:

“The most beautiful thing in Tokyo is McDonald’s. The most beautiful thing in Stockholm is McDonald’s. The most beautiful thing in Florence is McDonald’s. Peking and Moscow don’t have anything beautiful yet.”8

Russian composer Igor Stravinsky is credited with a far better joke, having once remarked that “lesser artists borrow, but great artists steal”. As with Warhol’s quip, it fits its author well: Stravinsky here downplays his unrivalled talent for pastiche, whereas Warhol could never resist hiding his gift for nihilism in plain sight.

But actually, advertising isn’t art at all, of course. Do I need to continue? It is a bloodless imitation that neither borrows nor steals, to go back to Stravinsky’s aphorism, but directly counterfeits. Feigning beauty and faking truth is all it knows, with a passing interest in the first in so far as it is saleable, and a pathological aversion to the second, since truth is its mortal enemy.

For if selling us what we least require and never thought we desired is advertising’s everyday achievement (and it is), then pushing products and ideas that will in reality make our lives more miserable or do us harm is its finest accomplishment. And the real thing? Like the stage magician, this is what the admen assiduously divert your attention away from.

Which brings me to a story. A real story. Something that happened as I was driving to work one dark, dank February morning. A small thing, but one that briefly thrilled and delighted me.

It was at the end of Corporation Street, fittingly enough I thought, where someone had summoned the courage to take direct action. Across the glowing portrait of a diligently air-brushed model were the words: “She’s not real. You are beautiful.”

That some anonymous stranger had dared to write such a defiant and generous disclaimer touched me. But it didn’t end there. This person, or persons unknown, had systematically defaced all three of the facing billboards, saving the best for last. It was one of those ‘messages’ that are determined to scare some back into line, whilst making others feel smug with a glow of compliant superiority. It read: “14 households on Primrose Street do not have a TV licence” (or words to that effect).

The threat, though implicit, was hardly veiled. In Britain, more than a hundred thousand people every year are tried and convicted for not having a TV licence. Some are actually jailed.9 But now this message had a graffiti-ed punchline which again brought home the hidden ‘message’ perpetuated by all of advertising. The spray-canned response read simply: “perhaps they’ve got a life instead.” A genuine choice the admen wouldn’t want you to consider. Not buying into things isn’t an option they can ever promote.

To add my own disclaimer, I in no way wish to encourage and nor do I endorse further acts of criminal damage – that said, here is a different piece of graffiti (or street art – you decide) that I happen to walk past on my way into work. In a less confrontational way, it too has taken advantage of an old billboard space:

the best things in life

Next chapter…


Addendum: a modest proposal

We are all living under a persistent and dense smog of propaganda (to give advertising and PR its unadorned and original name). Not only our product preferences and brand loyalties, but our entire Weltanschauung10 fashioned and refashioned thanks to a perpetual barrage of lies. Fun-sized lies. Lies that amuse and entertain. Lies that ingratiate themselves with fake smiles and seductive whispers. And lies that hector and pester us, reinforcing our old neuroses and generating brand new ones. These lies play over and over ad nauseam.

Ad nauseam, the sickness of advertising, is a man-made pandemic, with modern commercials selling not simply products per se, but “lifestyles”. And think about that for a moment. Off-the-shelf ideals and coffee table opinions that are likewise custom-made. Beliefs to complement your colour-coordinated upholstery, your sensible life insurance policy, your zesty soap and fresh-tasting, stripy toothpaste.

Thanks to television, we inhale this new opium of the people all day long and few (if any) are immune to its intoxication, but then advertising operates at a societal level too – since by disorientating individuals, society as a whole becomes more vulnerable to the predatory needs of corporations. So cuddling up to the box and laughing along to the latest blockbuster commercial on the grounds that “adverts don’t affect me” just makes our own delusion complete.

I might have ended on a lighter note, but instead I’ll hand over to the late Bill Hicks at his acrimonious best (and apologies for his foul and abusive language, but unfortunately here it is fully warranted):

“By the way, if anyone here is in marketing or advertising kill yourselves…”

Bill pauses to absorb any cautious laughter, then quietly continues: “Just a thought… I’m just trying to plant some seeds. Maybe, maybe one day they’ll take root… I don’t know, you try, you do what you can…”

Still scattering handfuls of imaginary seeds, but now sotto voce for suggestive effect: “Kill yourselves…”

Another pause and then completely matter of fact. “Seriously though – if you are – do!”

And now Bill gets properly down to business: “Ahhh – No really – There’s no rationalisation for what you do and you are Satan’s little helpers okay… Kill yourselves. Seriously. You are the ruiners of all things good. Seriously. No, No, this is not a joke… Ha,ha, there’s going to be a joke coming… There’s no fucking joke coming! You are Satan’s spawn filling the world with bile and garbage. You are fucked and you are fucking us – Kill yourselves – It’s the only way to save your fucking soul – kill yourself…”

Then he comes to the crux of the matter: “I know what all you marketing people are thinking right now too: ‘Oh, you know what Bill’s doing. He’s going for that anti-marketing dollar. That’s a good market. He’s smart…’ – Oh Man! I’m not doing that! You fucking evil scumbags! – ‘You know what Bill’s doing now. He’s going for the righteous indignation dollar. That’s a big dollar. Lot of people are feeling that indignation. We’ve done research – huge market! He’s doing a good thing.’ – God damn it! I’m not doing that you scumbags…! Quit putting the dollar sign on every fucking thing on this planet!”

If we are ever to break free from the mind-forged manacles of the advertising industry then we might consider the option of broadcasting Bill Hicks’ rant unabridged during every commercial break on every TV channel on earth for at least a year – the obscenities bleeped out in broadcasts before the watershed!

While we’re about it, we will need a screening prior to every movie (during the commercial slots obviously) as well as key phrases rehashed into jingles and those same sound bites written up in boldface and plastered across every available billboard. Now, if you think this would be altogether too much of an assault on our delicate senses then please remember that is precisely what the dear old advertising industry does day-in and day-out. So wouldn’t it be fun to turn the tables on those in the business of deceit? And not simply to give them a dose of their own snake oil, but to shock us all with repeated jolts of truth instead.


Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text all newly incorporated text has been italicised.


1 Incidentally, my young nephew had added a few scribbles of his own to this advertisement and it is interesting to note where he directed his pen marks, five places in all: one over each of the girl’s hands, one on the back of her head and another on her ponytail. And his only scribble that was not on the girl was on top of the scallop. Bullseye!

2 Of course in Hollywood films of a bygone age when censorship was strict, sharing a fag was also used as a metaphor for sex itself.

3 Taken from the opening to Chapter 1 entitled “Organising Chaos” of Propaganda, by Edward Bernays (1928).

4 Ibid. Chapter 4, “The psychology of Public Relations”

5 Ibid. Chapter 11, “The mechanics of propaganda”

6 “I’m as mad as hell, and I’m not going to take this anymore!” These are the words of anti-corporate evangelist Howard Beale, taken from the film Network (1976). A satire about a fictional television network called Union Broadcasting System (UBS), with its unscrupulous approach to raising the ratings, Network was written by Paddy Chayefsky and directed by Sidney Lumet. Most memorably, it features an Oscar-winning performance by the actor Peter Finch, playing the part of disaffected news anchor Howard Beale. Beale, having threatened to commit suicide live on air, is given his own show. Billed as “the mad prophet”, he steals the opportunity to angrily preach against what he sees as the corporate takeover of the world, and steadily his show gathers the largest audience on television. The consequences are, of course, inevitable.

7 “One hundred repetitions three nights a week for four years, thought Bernard Marx, who was a specialist on hypnopædia. Sixty-two thousand four hundred repetitions make one truth. Idiots!” From Chapter 3 of Brave New World by Aldous Huxley, published in 1932. 

8 Quote taken from Chapter 4 “Beauty” of The Philosophy of Andy Warhol: (From A to B and Back Again), published in 1975. 

9 “According to the most recent figures, about 70 people a year are jailed for TV licence fee offences. But the scale of prosecutions for licence fee evasion is far higher and now accounts for one in nine of all Magistrates Court cases. More than 180,000 people – almost 3,500 a week – appeared before the Magistrates Courts in 2012, accused of watching television without a valid licence, with 155,000 being convicted and fined.”

From an article entitled ‘Dodging TV licence will not be a crime’ written by Tim Ross, published in The Telegraph on March 7, 2014.

10 Weltanschauung: a particular philosophy or view of life; the world view of an individual or group.

Leave a comment

Filed under analysis & opinion, « finishing the rat race »

the price of everything

The following article is Chapter Nine of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from the category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.


When the accumulation of wealth is no longer of high social importance, there will be great changes in the code of morals. We shall be able to rid ourselves of many of the pseudo-moral principles which have hag-ridden us for two hundred years, by which we have exalted some of the most distasteful of human qualities into the position of the highest virtues. We shall be able to afford to dare to assess the money-motive at its true value. The love of money as a possession — as distinguished from the love of money as a means to the enjoyments and realities of life — will be recognised for what it is, a somewhat disgusting morbidity, one of those semi-criminal, semi-pathological propensities which one hands over with a shudder to the specialists in mental disease…”

John Maynard Keynes 1


Have you ever wondered what it’s like to be rich? Here I don’t just mean well-off, with a paltry few tens of millions in the bank, I mean proper rich – megabucks! So much money that, as I heard one comedian put it (aiming his joke squarely at the world’s richest entrepreneur), if Bill Gates were to stuff all his cash under the mattress, then due to interest alone, if he fell out of bed he’d never hit the ground!

I suppose what I’m wondering is this – and perhaps you’ve found yourself thinking along similar lines – why are these super-rich guys always so intent on accruing ever greater wealth when they already possess more than enough funds to guarantee the needs of a small country? Think about it this way: Gates and the others are, barring a few very necessary legal constraints, completely at liberty to do whatever they choose at every moment of every day. They can eat the best food, drink the most delicious vintage wines, smoke the finest cigars, play golf morning, noon, and evening, and then after the sun goes down, and if it is their wont, have liaisons with the most voluptuous women (or men) available. Quite literally, they have the means to go anywhere and do everything to their heart’s content and all at a moment’s notice. Just imagine that. So why bother about sales at all? I mean wouldn’t you eventually get bored of simply accumulating more and more money when you’ve already got so much – and let’s face it, money itself is pretty boring stuff. So just what is it that keeps them all going after it? After all, there are only so many swimming pools, grand pianos, swimming pools in the shape of grand pianos, Aston Martins, Lear Jets, and acreages of real estate that one man (or woman) can profitably use (in the non-profit-making sense obviously). Economists would call this the law of diminishing marginal utility, although in this instance it is basic common sense.2

Presented with evidence of this kind, some will say that here is further proof of the essential greediness of human beings. That, as a species, we are simply never satisfied until we have the lot. Fine then, let us take on this modern variant of original sin, since it certainly holds more than a grain of truth. For the sake of argument, we might presume that all men and women are greedy to an almost limitless extent. That this is truly the natural order, from our conception having been evolutionarily programmed to grab as much as we can for ourselves – our most primeval reflex being to snatch.

So I shall not waste too much time here. Only to say that I do not find such unrestrained cupidity within the circles of people with whom I have chosen to associate, most being happy enough to share out the peanuts and fork out for the next round of beers, quite oblivious to outcomes in terms of commensurate returns. What goes around comes around… There is, of course, no doubting that most folks will, very naturally, if opportunity arises, take good advantage to feather their own nests. Making life a little more comfortable for themselves, and reserving the larger share of their fortune for their immediate family and closest friends. But then, why not…? Charity begins at home, right?

What most don’t do (at least in the circles I know best) is devote their whole lives to the narrow utilitarian project outlined above. And why? Because, though quite understandably, money and property are greatly prized assets, they offer lesser rewards than companionship and love. And, in any case, pure generosity is its own reward – and I do mean “is”, and not “has” or “brings” – the reward being an inseparable part of the act itself: a something received as it was given, like a hug, like a kiss. That said, if you still prefer to believe that we are all to a man, woman and child, innately and incurably selfish and greedy, then next time you take a look into the mirror, do consider those all-too beady eyes staring back. It’s very easy to generalise about mankind when you forget to count yourself in.

But if not intractably a part of human nature, then we must find other reasons to account for how our world is nevertheless so horribly disfigured by rampant and greedy exploitation. For if greed is not an inherently human trait, and here I mean greed with a capital Grrr, then this monomaniacal obsession is all too frequently acquired, especially in those who approach the top of the greasy pole. There is an obvious circularity in this, of course. That those whose progress has depended upon making a buck, very often become addicted. As money-junkies, they, like other addicts, then prioritise their own fix above all else. Whether or not these types are congenitally predisposed to becoming excessively greedy, we have no way of knowing. What we can be certain of is this: that by virtue of having acquired such great wealth, they disproportionately shape the environment they and we live in. So they are not merely money-junkies, but also money-pushers. If you’re not a money-junkie then you don’t know what you’re missing. There’s nothing new in this. This is the way the world has been for many centuries, and perhaps ever since money was first invented.

So here’s Oscar Wilde addressing the same questions about money and our unhealthy relationship to it; his thoughts leaping more than a century, during which time very little has apparently changed:

“In a community like ours, where property confers immense distinction, social position, honour, respect, titles, and other pleasant things of this kind, man, being naturally ambitious, makes it his aim to accumulate this property, and goes on wearily and tediously accumulating it long after he has got far more than he wants, or can use, or enjoy, or perhaps even know of. Man will kill himself by overwork in order to secure property, and really, considering the enormous advantages that property brings, one is hardly surprised. One’s regret is that society should be constructed on such a basis that man has been forced into a groove in which he cannot freely develop what is wonderful, and fascinating, and delightful in him – in which, in fact, he misses the true pleasure of joy and living.”3

Embedded below is a recent interview [from December 2013] Pulitzer Prize-winning journalist Chris Hedges gave on “The Real News” in which he talked about – based to a large extent on his own personal experience – how the super rich are isolated and disconnected from the rest of society. He explains how this creates a deluded sense of entitlement and a pathological callousness:


Isn’t money funny stuff! Funny peculiar, I mean. We just take it so much for granted, almost as though it were a natural substance (disappointingly, of course, it doesn’t actually grow on trees). But when we do think about it, money has far stranger properties than anything in the natural world. And our relationship to it is more peculiar than our relationship to almost anything else.

Money, that’s what I want… sang the Beatles on one of their less celebrated tracks. But the truth will out. So just why did the Beatles want money, and, for that matter, why do I, and why do you? It doesn’t work, you can’t eat it, and it’s not, as a rule, a thing of special beauty. Money is absolutely useless in fact, right until you decide to swap it for what you actually want.

Money can’t buy me love, true again, but it might buy me a chocolate bar. Because money is really just a tool, a technology: a highly specialised kind of lubricant that enables people to exchange their goods and services with greater ease and flexibility. The adoption of a money system allows levels of parity for otherwise complex exchanges to be quickly agreed and settled. The great thing about money is, to provide a concrete illustration, that although £1 of tinned herring is probably equivalent to about thirty seconds of emergency plumbing (if you’re lucky), you won’t require crates of herring to pay for the call-out. So far so simple.

Except wait. We all know how the price of herring can go up as well as down, and likewise for the price of emergency plumbers. So why such a dynamic relationship? Well, there’s “the market”, a price-fixing system that arises spontaneously, regulating the rates of exchange between goods and services on the basis of supply adjusting to match demand. Thus by a stroke of good fortune, we find that money is not merely a lubricant for exchange, but also regulatory of useful production and services. This, at least, is the (widely accepted) theory.

Prices rise and fall in accordance with demand. Things that are in short supply become expensive, things that are abundant are cheaper. This is basic economic theory and it means, amongst other things, that in every transaction the “real value” of your money is actually relative, for the simple reason that the amount required depends not only on what you’re after, but also upon whether or not other people are after the same kind of thing. Money then, in terms of its “real value” to any individual or group, is something that is constantly varying. We might call this “the relativity of money”.

One consequence of the relative nature of money is that the useful value of money overall can also rise and fall. It is possible that wholesale, retail and labour costs can all more or less rise or fall together, although the general tendency, as we all know from experience, is for overall rising costs. Indeed such “inflation” is regarded as normal and expected, and, as a consequence, it comes to seem just as natural as money itself. Yet since you always need more and more money to buy the same things, the value of your money must, in some important way, be constantly falling. But just why does money as a whole lose its value in this way? What makes yesterday’s money worth less than today’s? Well it turns out that this is a huge question and one that economists have argued long and hard about.
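To put a rough number on that erosion, here is a small worked illustration (all figures invented for the purpose) of how a fixed sum of money steadily loses purchasing power under constant inflation:

```python
# Purchasing power of a nominal sum after some years of constant inflation.
# A sketch only: real inflation varies year to year and by basket of goods.

def real_value(nominal, annual_inflation, years):
    """What today's money will effectively be 'worth' after compounding inflation."""
    return nominal / (1 + annual_inflation) ** years

# £100 kept under the mattress through a decade of 3% annual inflation
# buys only about three-quarters of what it once did:
print(round(real_value(100, 0.03, 10), 2))  # → 74.41
```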

One partial account of inflation goes as follows: businesses and people in business are constantly looking for a little bit more. For how else can they maximise profits? In direct consequence, we, as customers, necessarily require more dosh to pay for the same goods or services. But to enlarge our budget, this automatically requires a commensurate increase in income, which means successfully negotiating for a larger salary. In the bigger picture then, the businesses supplying our wants and needs, are now needing to cover their larger wage-bills, which means higher prices to compensate. So prices and incomes rise together, with money becoming worth less and less precisely because everyone is trying to accumulate more and more of it. This endless tail-chasing escalation, which is given the fancy title of “the price/wage spiral”, serves as an excellent example of why money is really very odd stuff indeed.
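The tail-chasing of the price/wage spiral can be sketched as a toy loop (the markup rate and the figures are invented for illustration, not drawn from any economic data):

```python
# A toy model of the price/wage spiral: firms nudge prices up to widen
# profits, wages are renegotiated to keep pace, and both compound together.

def price_wage_spiral(price, wage, markup=0.03, years=10):
    """Compound both a unit price and an annual wage by the same yearly markup."""
    for _ in range(years):
        price *= 1 + markup  # businesses ask for a little bit more
        wage *= 1 + markup   # pay claims follow, to cover the higher prices
    return price, wage

price, wage = price_wage_spiral(price=1.00, wage=20_000)
print(round(price, 2), round(wage))  # both roughly a third higher after a decade
# Yet the wage buys exactly as many goods as before: money itself lost value.
print(round(wage / price))  # → 20000
```

The point of the sketch is the last line: nominal prices and nominal incomes have both escalated, while real purchasing power stands still.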

And what is money in any case? The first traders most likely exchanged shells, precious stones, or other baubles to aid in bartering, but then naturally enough, over time these exchanges would have been formalised, agreements arising with regards to which objects and materials were most acceptable as currency. The material that became most widely accepted was eventually, of course, gold. But why gold? Well, no one actually knows but we can make some educated guesses.

Firstly, gold is scarce, and it is also rare in other ways – for instance, having a unique and unusual colour, which just happens to correspond to the colour of the Sun. The fact that it is almost chemically inert and so doesn’t tarnish, means that it also shines eternally, and so again is like the Sun. Indeed, Aldous Huxley, in Heaven and Hell (his sequel to The Doors of Perception) points out that almost every substance that humans have ever regarded as valuable shares this property of shininess. To Huxley this is evidence that even money owes its origins, in part at least, to a common spiritual longing. Our wish to own a precious piece of paradise.

But back to more mundane matters, if gold (or any other substance) is chosen as your currency, then there arises another problem. How to guarantee the quantity and quality of the gold in circulation? For if gold is worth faking or adulterating then it’s certain that somebody will try cheating.

Well, one answer could be the adoption of some kind of official seal, a hallmark, and this solution leads, naturally enough, to the earliest forms of coinage. But then, if the coins are difficult to counterfeit, why bother to make them out of gold in the first place? Just the official seal would be enough to ensure authenticity. And why bother with metal, which is bulky and heavy? So again it’s an obvious and logical leap to begin producing paper banknotes. The value of these coins and banknotes, although far less intrinsically valuable in material terms than the gold they represent, is still backed by the promise that they are redeemable into gold. But hang on, what’s so special about the gold anyway (aside from its shininess)? And doesn’t the gold, which is now locked up in bullion reserves, in fact have real uses of its own? And doesn’t this mean that the gold also has a monetary value? So why not cut loose from the circularity and admit that the value of money can exist entirely independently of the gold or of any other common standard. Indeed, why couldn’t the issuing authority, which might be a government but is more often a central bank, simply make up a “legal tender”4 with no intrinsic or directly correlated value whatsoever and issue that? Not that the money issued need even correspond to the amount of real coins or paper banknotes in circulation – most of the world’s money being bits and bytes, ones and zeroes, orbiting out in cyber-space. Which brings us to just how funny money has now become.

The Pound Sterling, the various dollars, the Euro and every major currency on Earth are, to apply the correct terminology, “fiat currencies”.5 With fiat currencies there is no parity to the value of any other commodities and so they are, if you like, new forms of gold. As such, and given their shifting relative values, these new fiat currencies can also be traded as another kind of commodity. Money, in the form of currency, becoming an investment in itself. Money is strange stuff indeed.

Yet money also remains an instrument. And we use this instrument to measure just about everything. To establish the value of raw materials and manufactured items. The value of land and, by extension, the value of the space it occupies. The value of labour, and thus a value on the time used. And, since works of art are also bought and sold, money is even applied as a measure of such absolutely intangible qualities as beauty.

So money is basically a universally adaptable gauge, and this is its great strength. It is perhaps the big reason why its invention gradually caught on in such a fundamental way. From humble trading token, money has risen to become a primary measure of all things. But remember, remember… Money, whether fiat currency or gold standard, can never be real in the same way as tins of herring and plumbers are real, and neither is “monetary value” an absolute and intrinsic property, but only ever relative and acquired. Money, we ought to constantly remind ourselves (since we clearly need reminding) is nothing without us or without our highly structured civilisation – intrinsically, it is worthless. It is very strange stuff.

Perhaps the future benchmark for money will no longer be gold but ‘virtual gold’ in the form of cryptocurrencies – bitcoin being currently the most well-known of these. One advocate of these alternatives to traditional forms of money is financial expert Max Keiser. On February 3rd 2014, he spoke with coder, hacker and cryptocurrency specialist Andreas Antonopoulos about the regulation of bitcoin transactions; the advent of bitcoin derivatives, which he believes are less of a threat than ordinary derivatives (a subject I’m coming to next); the fact that unlike gold, cryptocurrencies can be ‘teleported’; and a future in which bitcoin is used widely by businesses as much as by individuals. He says that a time is coming when the prevalent misgivings and doubts about bitcoin and other cryptos will have long since been forgotten. Is he right? I don’t know and remain highly skeptical, but I find the debate an interesting one:

Incidentally, there are less radical and more tangible alternatives to the currencies we now have in circulation. “Treasury notes” are one such alternative and these have historical precedents in the form of both the American “greenback” and the UK’s Bradbury Pound. To read more about this and also for links to campaigns to reintroduce them please read the addendum at the end of the chapter.


Little more than a century ago, and even in the richest corners of the world, there were no dependable mechanisms to safeguard against the vicissitudes of fortune. If you weren’t already poor and hungry (as most were), then you could rest assured that potential poverty and hunger were waiting just around the corner. Anyone with aspirations to scale the ladder to secure prosperity faced the almost insurmountable barriers of class and (a generally corresponding) lack of education. A lower class person of such ambitions would be very well aware that if they could step onto the ladder at all, there was very little in the way of protection to save them in the event of falling; errors of judgement or sheer misfortune resulting in almost certain and unmitigated personal disaster. This was the sorry situation for people at all levels of society aside from the highest echelons.

One tremendous advantage then, of living in a modern society, is that, aside from having slightly less restricted social mobility (not that we now live in the classless society we are told to believe in), there are basic safety nets in place, with additional protection that is optionally available. For those languishing at the bottom of the heap, there are the reliable though meagre alms provided through a welfare system, whilst for the ever-expanding middle classes there is plenty of extra cover in the form of savings schemes, pension schemes, and, in the event of the most capricious and/or calamitous of misfortunes, the ever-expanding option of insurance policies. If The Merchant of Venice had been set in today’s world then the audience would feel little sympathy for the merchant’s predicament. Why had he ventured on such a risk in the first place, casting his fortune adrift on dangerous waters? Why hadn’t he protected his assets by seeking independent financial advice and taking out some preferential cover? It’s a duller story altogether.

Systems for insurance are essential in any progressive civilisation. Protection against theft, against damage caused by floods, fires and other agents of destruction, and against loss of life and earnings. Having insurance means that we can all relax a bit, quite a lot, in fact. But it also means that, alongside the usual commodities, there’s another less tangible factor to be costed and valued. That risk itself needs to be given a price, and that necessarily means speculating about the future.

Indeed, speculations about the future have become very much at the forefront of financial trading. As a consequence of this, at least in part, today’s financial traders have become accustomed to dealing in “commodities” that have no intrinsic use or value whatsoever. They might, for example, exchange government bonds for promises of debt repayment. Or, feeling a little more adventurous, they might speculate on the basis of future rates of foreign exchange, or in interest rates, or share prices, or rates of inflation, or in a multitude of other kinds of “underlying assets” (including that most changeable of underlying variables: the weather) by exchange of promissory notes known most commonly as “derivatives”, since they derive their value entirely on the basis of the future value of something else. And derivatives can be “structured” in a myriad of ways. Here are just a few you may have heard of:

  • futures (or forwards) are contracts to buy or sell the “underlying asset” at a future date at a price agreed today.
  • options allow the holder the right, without obligation (hence “option”), to buy (a “call option”) or to sell (a “put option”) the “underlying asset.”
  • swaps are contracts agreeing to exchange money up until a specified future date, based on the underlying value of exchange rates, interest rates, commodity prices, stocks, bonds, etc.
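The essential difference between these contracts is easiest to see in what each pays its holder at expiry. A hypothetical sketch (the function names and figures are mine, purely for illustration):

```python
# Expiry payoffs for the three contracts listed above. A sketch only:
# real contracts also involve premiums, margin and settlement details.

def forward_payoff(spot_at_expiry, agreed_price):
    # A future/forward obliges the holder to trade at the agreed price,
    # so they gain or lose the full difference either way.
    return spot_at_expiry - agreed_price

def call_option_payoff(spot_at_expiry, strike):
    # A call is the right, without obligation, to buy at the strike:
    # exercised only when profitable, so the payoff never falls below zero.
    return max(spot_at_expiry - strike, 0)

def put_option_payoff(spot_at_expiry, strike):
    # A put is the right to sell at the strike: it pays off when the
    # underlying asset falls -- the same shape as an insurance policy.
    return max(strike - spot_at_expiry, 0)

# Suppose a contract was struck at £100 and the asset ends the year at £80:
print(forward_payoff(80, 100))      # → -20 (the forward holder loses)
print(call_option_payoff(80, 100))  # → 0  (the call expires worthless)
print(put_option_payoff(80, 100))   # → 20 (the put pays out, like a claim)
```

Note how the put’s payoff mirrors an insurance claim, a point the chapter returns to below.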

You name it: there are now paper promises for paper promises of every conceivable kind. Now the thing is that because you don’t need to own the “underlying asset” itself, there is no limit to the amounts of these paper promises that can be traded. Not that this is as novel as it may first appear.

Anyone who’s ever bought a lottery ticket has in effect speculated on a derivative, its value in this case being entirely dependent upon the random motion of coloured balls in a large transparent tumbler at an allocated future time. All betting works this way, and so all bets are familiar forms of derivatives. And then there are, if you like, negative bets. Bets you’d rather lose. For instance, “£200 says my house will burn down this year” is presumably a bet you’d rather lose, but it is still a bet that many of us annually make with an insurance company. And general insurance policies are indeed another form of familiar derivative – they are in effect “put options”.

However there is one extremely important difference here between an ordinary insurance policy and a “put option” – in the case of the “put option”, you don’t actually need to own the “underlying asset”, which means, to draw an obvious comparison, you might take out house insurance on your neighbour’s property rather than your own. And if their house burns down, ahem, accidentally of course, then good for you. Cash in your paper promise and buy a few more – who knows, perhaps your neighbour is also a terrible driver. There are almost numberless opportunities for insuring other people’s assets and with only the law preventing you, then why not change the law? Which is exactly what has happened, with some kinds of derivatives circumventing the law in precisely this way, and permitting profitable speculation on the basis of third party failures. When it comes to derivatives then, someone can always be making a profit come rain or shine, come boom or total financial meltdown.

But, why stop there? Especially when the next step is so obvious that it almost seems inevitable. Yes, why not trade in speculations on the future value of the derivatives themselves? After all, treating the derivative itself as an “underlying asset” opens the way for multiple higher-order derivatives, creating with it the opportunity for still more financial “products” to be traded. Sure, these “exotic financial instruments” quickly become so complex and convoluted that you literally need a degree in mathematics in order to begin to decipher them. Indeed those on the inside make use of what are called “the Greeks” and “the Higher Order Greeks”, since valuation requires the application of complex mathematical formulas composed of strings of Greek letters – the traders here fully aware, of course, that it’s all Greek to the rest of us. Never mind – ever more financial “products” means ever more trade, and that’s to the benefit of all, right…?

Deregulation of the markets – kicked off in Britain by the Thatcher government’s so-called “Big Bang” and simultaneously across the Atlantic through the laissez-faire of “Reaganomics”6 – both enabled and encouraged this giddying maelstrom, allowing in the process the banking and insurance firms, the stockbrokerages and hedge funds that make up today’s “finance industry” to become the single most important “wealth creator” in the Anglo-American world. Meanwhile, declines in manufacturing output in Britain and America meant both nations were becoming increasingly dependent on sustained growth in the financial sector – with “derivatives” satisfying that requirement for growth by virtue of their seemingly unbounded potential. Indeed, having risen to become by far the largest business sector simply in terms of profit-making, many of the largest banks and insurance groups had become “too big to fail”7 – failure leading potentially to national, if not international, economic ruin. Which is how the very systems that were supposedly designed to protect us, systems of insurance, have, whether by accident or design, left us more vulnerable than ever.

And then came the bombshell, as we learnt that the banks themselves were becoming bankrupt, having gambled their investments in the frenzy of deregulated speculation. Turns out that some of the money-men didn’t fully understand the complexity of their own systems; a few admitting with hindsight that they’d little more knowledge of what they were buying into than the rest of us. They’d “invested” because their competitors “invested”, and, given the ever-growing buoyancy of the markets at the time, not following suit would have left them at a competitive disadvantage. A desperate but strangely appropriate response to the demands of free market capitalism gone wild.


It is currently estimated that somewhere in the order of a quadrillion US dollars (yes, that’s with a qu-) has been staked on derivatives of various kinds. Believe it or not, the precise figure is actually uncertain because many deals are brokered in private. In the jargon of the trade these are called “over the counter” derivatives, which is an odd choice of jargon when the only thing the average customer buys over the counter is drugs. Could it be that they’re unconsciously trying to tell us something again?

So just how big is one quadrillion dollars? Well, let’s begin with quadrillion. Quadrillion means a thousand trillion. Written out at length, it is a one followed by a string of fifteen zeros. A number so humungous that it’s humanly impossible to properly comprehend: all comparisons fail. I read somewhere that if you took a quadrillion pound coins and put them side by side then they would stretch further than the edge of the solar system. The Voyager space programme was, of course, a much cheaper alternative. Or how about this: counting a number every second, it would take 32 million years to count up to a quadrillion… Now obviously that’s simply impossible – I mean just try saying “nine hundred and ninety-nine trillion, nine hundred and ninety-nine billion, nine hundred and ninety-nine million, nine hundred and ninety-nine thousand, nine hundred and ninety-nine” in the space of one second! You see it really doesn’t help to try to imagine any number as big as a quadrillion.
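The arithmetic behind that 32-million-year figure is easily checked, assuming one number counted per second:

```python
QUADRILLION = 10 ** 15                    # a one followed by fifteen zeros
SECONDS_PER_YEAR = 60 * 60 * 24 * 365.25  # seconds in an average (Julian) year

# Counting one number every second, without pause:
years = QUADRILLION / SECONDS_PER_YEAR
print(f"{years / 1e6:.1f} million years")  # roughly 32 million years
```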

However, there are still useful ways to compare a quadrillion dollars. For instance, we can compare it against the entire world GDP which turns out to be a mere 60 trillion US dollars8. One quadrillion being nearly twenty times larger. Or we might compare it against the estimated monetary wealth of the whole world: about $75 trillion in real estate, and a further $100 trillion in world stock and bonds. So one quadrillion is a number far exceeding even the total monetary value of the entire world – material and immaterial! A little freaky to say the least! Especially when we discover that many of these derivatives are now considered to be “toxic assets”, which is a characteristically misleading way of saying they are worth nothing – yes, worthless assets! – whatever the hell that means!
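The comparisons are simple ratios, using the figures quoted above ($60 trillion world GDP, $75 trillion in real estate plus $100 trillion in stocks and bonds):

```python
quadrillion = 1e15
world_gdp = 60e12                  # quoted figure for entire world GDP
world_wealth = 75e12 + 100e12      # real estate plus world stocks and bonds

print(quadrillion / world_gdp)     # ~16.7: "nearly twenty times larger"
print(quadrillion / world_wealth)  # ~5.7: still far exceeding total wealth
```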

So just like the Sorcerer’s Apprentice, it seems that the spell has gone out of control, and instead of these mysterious engines making new money out of old money, the system has created instead an enormous black hole of debt. A debt that we, the people, are now in the process of bailing out, with extremely painful consequences. Efforts to save us from a greater catastrophe having already forced the British and US governments to pump multiple hundreds of billions of public money into the coffers of the private banks. Yet the banks and the economy remain broken of course, because how is any debt larger than the monetary value of the entire world ever to be repaid?

Another tactic to halt descent into a full-blown economic meltdown has involved the issuance of additional fiat currency in both Britain and America; a “quantitative easing” designed to increase the supply of money by simply conjuring it up (a trick that fiat currency happily permits). Money may not grow on trees but it can most certainly be produced out of thin air. But here’s the rub. For in accordance with the most basic tenets of economic theory, whenever extra banknotes are introduced into circulation, the currency is correspondingly devalued. So you may be able to conjure money from thin air, but all economists will readily agree that you cannot conjure “real value”, meaning real purchasing power. Indeed this common mistake of confusing “nominal value” (i.e., the number of pounds written on the banknote) with “real value”, is actually given a name by economists. They call it: “the money illusion”. And it’s useful to remind ourselves again that money has only relative value.

To understand this, we might again consider money to be a commodity (which in part it is, traded on the currency markets). As such, and as with all other commodities, relative scarcity or abundance will alter its market value, and, in obedience to the law of supply and demand, more will automatically mean less. This is just as true for the value of money as it is for tins of herring, plumbers, scotch eggs and diamonds. So it seems that if too much of our quantitative is eased, then we’d better be prepared for a drastic rise in inflation, or much worse again, for hyperinflation. Printing too much money is how hyperinflation has always been caused.
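The “money illusion” can be sketched with the crudest possible quantity-theory assumption – that prices rise in strict proportion to the money supply. Real economies respond with lags and only partially, so this is an illustration, not a model:

```python
def real_value(nominal, money_supply_growth):
    """Purchasing power of a nominal sum after the money supply grows,
    assuming prices rise in strict proportion (naive quantity theory)."""
    price_level = 1 + money_supply_growth
    return nominal / price_level

savings = 1000                        # nominal pounds: unchanged on paper
after_qe = real_value(savings, 0.25)  # money supply expanded by 25%
print(after_qe)                       # 800.0: same banknotes, less purchasing power
```

The banknote still says £1000 – that is its nominal value – but it now buys what £800 bought before: the gap between the two is precisely the money illusion.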

Our future is bleak, they tell us. Our future is in the red. So much for security, so much for insurance. We’d apparently forgotten to beware of “the Greeks” and of the “higher order Greeks” when they’d first proffered gifts.


I said earlier, just in passing, that money is actually pretty boring stuff, and it is… Truly, madly and deeply boring! So when I hear on the news how “the markets” are hoping that the latest round of “quantitative easing” will enable governments to provide the necessary “fiscal stimulus”, I am barely even titillated. Meanwhile, explanations in the popular press and the supposedly more serious media alike, which like to describe such injections of new money as in some way analogous to filling up my car with imaginary petrol, provide me only with a far, far more entertaining distraction: to wit, a magical car that runs on air.

But then, of course, money isn’t really stuff at all! More properly considered, money is perhaps a sort of proto-derivative, since its worth is evidently dependent upon something other than the paper it’s (increasingly not) written on. So what is it that money’s worth depends upon? What underlies money? Well, the accepted answer to this question is apparently that money is a “store of value”. Although this leads immediately to the obvious follow-up question: in this context, what precisely is the meaning of “value”? But here again there is a problem, since “value”, although a keystone of economic thinking, has remained something of an enigma, with economists unable to agree upon any single definitive meaning.

Is “value” determined by usefulness? Or is it generated by the amount of effort required in the production of things? Or perhaps there is some other kind of innate economic worth – residing, for instance, in a thing’s scarcity. And can this worth be attributed at the individual level, or is it only socially imputed?

There is a wide variety of definitions and explanations of “value”, and these differences, being so foundational, have encouraged the various branches of economic theory to diverge. And here is another important reason why economics is in no way equivalent to the physical sciences. Ask any physicist what energy is, and they will provide both an unambiguous definition and, no less importantly, offer established methods for measurement. Because of this, if ever one physicist talks to another physicist about energy (or any other physical quantity) they can be absolutely certain that they are talking about the same thing. Which is very certainly not the case when economists talk about “value”.

“A cynic is a man who knows the price of everything and the value of nothing,” said Oscar Wilde, distinguishing with playful wisdom the difference in human terms between “price” and “value”. The great pity is that the overwhelming majority of today’s economists have become so cynical – but then perhaps they always were.


As part of his ongoing assault against religion, Richard Dawkins recently published a book called The God Delusion. It’s the old hobby-horse again; one that he shares with a great many millions of other broadly liberal, literate and intelligent people: that religion is an evil of which humanity must rid itself totally. And yes, much of religion has been dumb and dangerous, this I will very readily concede (and already have conceded in earlier chapters). But really and truly, is it “the God delusion” that we should be most concerned about in these torrid times? For regardless of Dawkins’ claims, it is quite evident that religion is a wounded animal, and for good or ill, the secular world is most certainly in the ascendant. Right throughout the world, aside from a few retreating pockets of resistance, faith in the old gods has been gravely shaken. It is not that human faith, by which I mean merely belief in and/or worship of something greater, is extinguished, for it never can be, but that it has been reattached to new idol-ologies. And in those parts of the world where the old religions have been most effectively disarmed or expelled, namely the West, one idol-ology above all others has gathered strength from religion’s demise.

Richard Dawkins has said many times that instructing young children in religious obedience is a form of psychological child abuse, and on this point I wholeheartedly support him. Children’s minds are naturally pliable for very sound developmental reasons. But is it any less pernicious to fill their precious minds with boundless affection for, let’s say, Ronald McDonald? For this is merely one stark but obvious illustration of how a new fundamentalism has been inculcated in the young. Devotion to the brand. Love of corporations. Worship of the dollar and the pound.

This new kind of fundamentalism has long since swept across the world, but it is unusual, although not unique, in that it denies its own inherent religiosity whilst claiming to have no idols. This is the fundamentalism of free market neoliberal economics. The Father, Son and Holy Ghost having been forsaken, only to have been usurped by the IMF, the World Bank and the WTO. If you think I’m joking, or that this is mere hyperbole, then think again. When things are tough we no longer turn to the heavens, but instead ask what sacrifices can be made to “reassure the markets”. Sacrifices to make it rain money again.

Here, above all, is the most pernicious delusion of our age. And it has next to nothing to do with God, or Yahweh, or Allah, or even the Buddha. The prophets of our times talk of nothing besides profits and losses. They turn their eyes to the Dow Jones Index, trusting not in God, but only in money. So I call for Dawkins to leave aside his God delusion, for a moment, and pay a little attention to the rise and rise of “the money delusion”. If future historians reflect on our times, this is what they will see, and given the mess this “money delusion” is creating they will scratch their heads in disbelief and disgust.


I have already discussed the so-called “money illusion” – of mistaking nominal banknote value for real purchasing value – but this is merely one of many nested and interrelated illusions that make up “the money delusion”. Illusions that have become so ingrained within our permitted economic thinking that they are completely taken for granted.

Foundational is the belief that individuals always make rational choices, where “rational choice” is defined to mean that we all choose with consistency and always with the aim of choosing more over less. That a huge advertising industry now exists to tempt us into irrationality is never factored in. Nor are the other corrosive influences that so obviously deflect our rational intentions: the coercion of peer pressure, our widespread obsession with celebrities and celebrity endorsement, and that never-ending pseudo-scientific babble that fills up many of the remaining column inches and broadcast hours of our commercial media. We are always eager for the latest fashionable fads, and perhaps we always were. Yet this glaring fact, that people make wholly irrational choices time and again, whether due to innate human irrationality or by deliberate design, is of little concern to most economists. It is overlooked and omitted.

Likewise, a shared opinion has arisen under the name of neoliberalism that economics can itself be neutral, usefully shaping the world without the nuisance of having to rely on value judgements or needing any broader social agenda. If only individuals were left to make rational choices, as of course they do by definition, or so the idea goes, and the market could also be unshackled, then at last the people will be free to choose. Thus, goes the claim, individual freedom can only be guaranteed by having freedom within the marketplace. Freedom trickling down with the money it brings. “Wealth creation” alone must solve our problems by virtue of it being an unmitigated good.

Of course, back in the real world, one man’s timber very often involves the destruction of another man’s forest. Making profits from the sale of drugs, tobacco and alcohol has social consequences. Factories pollute. Wealth creation has its costs, which are very often hidden. There is, in other words, and more often than not, some direct negative impact on a third party, known to economists as “spillover” or “externalities”, that is difficult to quantify. Or we might say that “wealth creation” for some is rather likely therefore to lead to “illth creation” for others.

Illth creation? This was the term coined by romantic artist, critic and social reformer, John Ruskin, and first used in his influential critique of nineteenth century capitalism entitled Unto This Last. Ruskin had presumably never heard of “the trickle-down effect”:

“The whole question, therefore, respecting not only the advantage, but even the quantity, of national wealth, resolves itself finally into one of abstract justice. It is impossible to conclude, of any given mass of acquired wealth, merely by the fact of its existence, whether it signifies good or evil to the nation in the midst of which it exists. Its real value depends on the moral sign attached to it, just as sternly as that of a mathematical quantity depends on the algebraical sign attached to it. Any given accumulation of commercial wealth may be indicative, on the one hand, of faithful industries, progressive energies, and productive ingenuities: or, on the other, it may be indicative of mortal luxury, merciless tyranny, ruinous chicane.”9


We are in the habit of regarding all money as equal. Presuming that the pounds and pence which make up my own meagre savings are equivalent in some directly proportional manner to the billions owned by let’s say George Soros. A cursory consideration shows how this is laughable.

For instance, we might recall that on “Black Wednesday” in 1992, Soros single-handedly shook the British economy (although the then-Chancellor of the Exchequer, Norman Lamont, was left to shoulder the blame)10. But to illustrate this point a little further, let me tell you about my own small venture into the property market.

Lucky enough to have been bequeathed a tidy fortune, I recently decided to purchase a house to live in. The amount, although not inconsiderable by everyday standards (compared, say, with the income and savings of Mr and Mrs Average), and very gratefully received, was barely sufficient to cover local house prices, except that I had one enormous advantage: I had cash, and cash is king.

For reasons of convenience, cash is worth significantly more than nominally equivalent amounts of borrowed money. In this instance I can estimate that it was probably worth a further 20–30%. Enough to buy a far nicer house than if I’d needed to see my bank manager. A bird in the hand…

Having more money also has other advantages. One very obvious example being that it enables bulk purchases, which being cheaper, again inflates its relative value. The rule in fact is perfectly straightforward: when it comes to money, more is always more, and in sufficient quantities, it is much, much more than that.

But then, of course, we have the market itself. The market that is supposedly free and thus equal. The reality, however, is that since money accumulates by virtue of attracting its own likeness, the leading players in the market, whether wealthy individuals or giant corporations, by wielding larger capital resources, can operate with an unassailable competitive advantage. These financial giants can and do stack the odds even higher in their favour by more indirect routes, such as buying political influence with donations to campaign funds, and by other insidious means such as lobbying – all of which is simply legally permitted bribery. The vaunted notion of a free market is therefore the biggest nonsense of all. There is no such thing as a free market: never has been and never will be.

The most ardent supporters of free market neoliberalism say that it is a non-normative system, which permits us finally to rid ourselves of disagreements over pesky value judgements. The truth, however, is very much simpler. By ignoring values, it becomes a system devoid of all moral underpinning. Being morally bankrupt, it is unscrupulous in the truest sense of the word.


If I had enough money and a whim, I might choose to buy all the plumbers and tins of herring in Britain. Then, since money is (in part) a measure of scarcity, I could sell them back later with a sizeable mark-up. Too far-fetched? Well, perhaps, but only in my choice of commodity. The market in other commodities has without any question been cornered many times in the past. For instance, by the end of the 1970s, two brothers, Nelson Bunker and William Herbert Hunt, had accumulated and held what was then estimated to be one third of all the world’s silver. This led to serious problems both for high-street jewellers11 and for the economy more generally12, and as it happened, when the bubble burst on what became known as “Silver Thursday”, it also spelt trouble for the brothers’ own fortune. Fortunately for them, however, the situation was considered so serious that a consortium of banks came forward to help to bail them out13. They had lost, their fortune diminished, although by no means wiped out. As relatively small players they’d played too rough; meanwhile much larger players ensure that the markets are routinely rigged through such manufacture of scarcity. Going back as early as 1860, John Ruskin had already pointed out a different but closely-related deficiency in any market-driven capitalist system of trade:

“Take another example, more consistent with the ordinary course of affairs of trade. Suppose that three men, instead of two, formed the little isolated republic, and found themselves obliged to separate, in order to farm different pieces of land at some distance from each other along the coast: each estate furnishing a distinct kind of produce, and each more or less in need of the material raised on the other. Suppose that the third man, in order to save the time of all three, undertakes simply to superintend the transference of commodities from one farm to the other; on condition of receiving some sufficiently remunerative share of every parcel of goods conveyed, or of some other parcel received in exchange for it.

“If this carrier or messenger always brings to each estate, from the other, what is chiefly wanted, at the right time, the operations of the two farmers will go on prosperously, and the largest possible result in produce, or wealth, will be attained by the little community. But suppose no intercourse between the landowners is possible, except through the travelling agent; and that, after a time, this agent, watching the course of each man’s agriculture, keeps back the articles with which he has been entrusted until there comes a period of extreme necessity for them, on one side or other, and then exacts in exchange for them all that the distressed farmer can spare of other kinds of produce: it is easy to see that by ingeniously watching his opportunities, he might possess himself regularly of the greater part of the superfluous produce of the two estates, and at last, in some year of severest trial or scarcity, purchase both for himself and maintain the former proprietors thenceforward as his labourers or servants.”14

By restricting the choices of others, one’s power over them is increased, and it is this that brings us to the real reason why money becomes such an addiction, especially for those who already have more than they know what to do with. For truly the absolute bottom line is this: that money and power become almost inseparable unless somehow a separation can be enforced. And as wealth, especially when excessive, accumulates, as it almost invariably does, along with it goes the accumulation of power. This underlying and centralising mechanism has perhaps always operated at the heart of all civilisation. But even the power of money has its limits, as Ruskin points out:

“It has been shown that the chief value and virtue of money consists in its having power over human beings; that, without this power, large material possessions are useless, and to any person possessing such power, comparatively unnecessary. But power over human beings is attainable by other means than by money. As I said a few pages back, the money power is always imperfect and doubtful; there are many things which cannot be reached with it, others which cannot be retained by it. Many joys may be given to men which cannot be bought for gold, and many fidelities found in them which cannot be rewarded with it.

“Trite enough, – the reader thinks. Yes: but it is not so trite, – I wish it were, – that in this moral power, quite inscrutable and immeasurable though it be, there is a monetary value just as real as that represented by more ponderous currencies. A man’s hand may be full of invisible gold, and the wave of it, or the grasp, shall do more than another’s with a shower of bullion. This invisible gold, also, does not necessarily diminish in spending. Political economists will do well some day to take heed of it, though they cannot take measure.”15

Until such a time, every action and probable outcome must continue to be evaluated on the basis of strict cost and benefit estimates. Our “ponderous currencies” literally enabling a figure to be set against each human life – an application fraught with the most serious moral dilemmas and objections – and beyond even this, we have price tags for protecting (or else ruining) the natural environment all our lives depend upon. For only the market can secure our futures, optimally delivering us from evil, though inevitably it moves in mysterious ways. Which is how the whole world – land, water, air and every living organism – came to be priced and costed. Everything set against a notional scale that judges exclusively in terms of usefulness and availability, such is the madness of our money delusion.

We are reaching a crisis point. A thoroughgoing reappraisal of our financial systems, our economic orthodoxies, and our attitudes to money per se is desperately required. Our survival as a species may depend on it. Money ought to be our useful servant, but instead remains, at least for the vast majority, a terrible master. As a consequence, our real wealth has been too long overlooked. Time then for this genie called money to be forced back tight inside its bottle. Ceaselessly chasing its golden behind, and mistaking its tight fist for the judicious hand of God, is leading us ever further down the garden path. Further and further away from the land it promises.

Next chapter…


Addendum: Q & A

Back in April 2012, I forwarded a draft of this chapter to friends in Spain (a nation already suffering under imposed “austerity measures”). They sent an extended reply which raised two interesting and important questions. Both questions along with my replies are offered below:

Q1: You seem to be saying that printing money (as the US and UK, who are in control of their own currency, are doing) is as bad as dealing with the debt problem by means of austerity (the “Merkozy” approach). But the latter is surely definitely worse.

A. I think these are simply two sides of the same scam. The bankers create an enormous unpayable debt and then get governments to create new money to bail them out. This is sold to us as a way of bailing out a few chosen victims (Greece, Spain, Portugal, Ireland) although it simply means a huge transfer of wealth from public into private hands. To make that money useful to the bankers (and the rest of the ruling elite) ‘austerity measures’ are put in place which not only steal money off the average person but also permit the fire sale of national assets. Meanwhile, in Britain and America, the governments are helping to pay for these bailouts by creating money out of thin air, which means the real value of our money is reduced through inflation (effectively a hidden tax). If the money were invested in infrastructure or education or whatever, then this could potentially be a good thing (even though it still creates inflation), so certainly QE could have been beneficial but not when you use the money only to keep afloat a huge Ponzi scheme. But then you ask later…

Q2: ‘but how come the pound is high now and the euro low’

A. That’s a very good question and I won’t pretend that I understand this completely, but I gather there are plenty of ways for keeping currencies higher than they ought to be by manipulating the markets [incidentally, the Forex Scandal to manipulate and rig the daily foreign exchange rates did not come to light until Summer 2013]. The market is rigged in any case by virtue of the fact that the dollar remains the world’s reserve currency and that oil is traded entirely in dollars. But essentially what’s going on here is a huge currency war, and the euro is constantly under attack from speculators. I am fairly certain that the chickens will come home to roost sooner or later in America and Britain (and in Germany too), but meanwhile the governments simply go about cooking the books and telling us how inflation is only 4% or whatever when fuel prices, for instance, have rocketed during the past few years. In any case, we get ‘austerity’ too, not as hardline yet as the ‘austerity’ being imposed elsewhere, but it will come – of this I have no doubt. Either it will happen slowly, or worse, there will be a huge war and the ‘austerity’ will be brought into place to justify the expense of that. This is a deliberate attack by the bankers against the people of the world, and until the people of the world say that’s enough, and most of the debts are cancelled outright, I don’t see any way this can be reversed.


Another topic I briefly touched upon in the chapter above is the matter of inflation. What is it and what causes it? My answers were sketchy, in part, because I wished to avoid getting too bogged down in technicalities beyond my training. But this question about the causes of inflation is, in any case, an extremely thorny one. Different schools of economists provide different explanations.

One less orthodox account that I have frequently come across is that our fractional reserve banking system when combined with a central bank’s issuance of a fiat currency is inherently inflationary. That in the long term, and solely because of these extant monetary mechanisms, inflation is baked into the cake. So I wrote to a friend who holds with the above opinion and asked if he would explain “in the briefest terms that are sufficient” why he and others believe that central bank issuance of currency and fractional reserve banking are the primary underlying cause of inflation. Here is his succinct but detailed reply:

In a central bank system, money is created in the first instance by governments issuing bonds to banks and banks “printing” money and handing it over to the government in return. The government then owe the banks the money plus interest. If they ever pay back any of the principal, then a corresponding amount of bonds are handed back, i.e. cancelled. In that case, the money repaid goes out of existence!

Before elaborating any further, let’s take a step back. Fractional reserve lending doesn’t require central banks, nor does it require governments to create money by issuing bonds in exchange for it. Fractional reserve lending is simply the act of taking someone’s money to “look after it”, then turning around and lending a fraction of it to someone else. If the lender has enough depositors, then the sum of all the unlent fractions of each deposit should cover him if one of them suddenly comes through the door asking for all their money back in one go. As I’m sure you know, if too many turn up at once looking for their money, a run ensues. Fractional reserve banking doesn’t even require a government sanctioned paper currency to exist. Depositors can simply deposit something like gold and the lenders can issue receipts which become the paper currency.

In olden times, when depositors of gold first found out that the goldsmiths they were paying to store their gold safely were lending it out for a percentage fee, they were outraged. The goldsmiths appeased them by offering them a cut of the fee for their interest in the scam. Accordingly, this money became known as ‘interest’.

So where do central banks fit in? Countries like the United States operated without central banks prior to 1913. There were thousands of banks of all sizes. To compete with one another, they had to endeavour to offer higher interest to depositors, lower interest rates to borrowers, or to cut the fraction of deposits that they kept in reserve. This latter aspect was what occasionally caused banks to go to the wall, to the detriment of their depositors.

Central banking avoids this risk because the same fractional reserve ratio applies to all the banks under a central bank’s jurisdiction. However, it is really a way to avoid competition, and if the system ever does get into trouble, the government feels obliged to bail it out or risk collapse of the whole system.

Now to answer your question about inflation.

In a fractional reserve central bank system, money is created as I’ve described by the government issuing bonds to the bank, receiving money created out of thin air and having to pay interest on it. When they spend it by paying salaries of government employees, contractors, arms manufacturers and so on, that money goes straight into bank accounts and the bankers can’t wait to lend out as much of it as possible, up to the limit of whatever fractional reserve ratio applies. So now there is a double claim on the money. The government employee thinks their salary is sitting in the bank but 90 percent of it is in the pocket of a borrower who thinks it’s theirs as long as they keep up with interest. That borrower will inevitably either put the borrowed sum in their own bank account or spend it. Either way it will end up in another bank account somewhere. Then the same thing happens again; up to 90 percent of it gets lent out (81 percent of the original government-created money) and so on…
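The re-lending cascade just described can be sketched in a few lines of Python (purely illustrative; the £100 starting sum and 10 percent reserve ratio are assumed figures for the example, not drawn from any actual banking regulation):

```python
# Illustrative sketch of fractional reserve re-lending: money is deposited,
# lent out (minus the reserve fraction), redeposited, lent again, and so on.

def total_deposits(initial: float, reserve_ratio: float, rounds: int) -> float:
    """Sum of all deposits created as the same money is re-lent and redeposited."""
    total = 0.0
    deposit = initial
    for _ in range(rounds):
        total += deposit                # this round's deposit sits in some account
        deposit *= (1 - reserve_ratio)  # the lent-out fraction becomes the next deposit
    return total

# With a 10 percent reserve ratio, £100 of new money supports total deposits
# approaching £1,000: the geometric series 100 + 90 + 81 + ...
print(round(total_deposits(100, 0.10, 200)))  # → 1000
```

The limit, initial divided by the reserve ratio, is why the reserve ratio is sometimes said to set a ‘money multiplier’ of one over the ratio.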

We end up in a situation where all of the money in circulation has arisen from someone somewhere signing the dotted line to put themselves in debt. The money isn’t backed by a commodity such as gold. Instead it is backed by the ability of the borrower to repay. All these borrowers, including the government, are paying interest. If interest is to be paid on every penny in circulation, then it doesn’t take a genius to figure out that new money must be continuously ‘created’ to keep paying this. That occurs by governments constantly borrowing so that their debts keep on increasing and borrowers constantly borrowing more and more. This seems to work as long as prices, wages and asset values keep increasing. Generation after generation, workers can afford to pay more and more for the houses that they live in because the price of the house keeps going up so it looks like good collateral to the lender and also their wages keep going up, so the borrower can meet payments in the eyes of the lender.

Working out what the rate of inflation is at any given time is practically impossible. Government figures such as RPI and CPI are just another tool for the propagandists to use as they see fit at any given time. However, for the banks to gain anything from the game, the rate of inflation must be:

  • less than the rate of interest paid by borrowers, and
  • greater than the rate of interest paid to savers.

This is why savers’ money is ‘eroded’ if they just leave it sitting in a bank account.
Now imagine a different system where:

  • governments issue paper money by printing it themselves;
  • the amount in circulation is absolutely fixed;
  • there is no central bank but there are plenty of independent banks.

In such a country, there is no need for the government to have any debt and there is ample historical evidence of nations that have existed without government debt for very long stretches of time. What borrowers there are have to find the interest by earning it from the fixed pool of currency that is in circulation. There is little need for anyone to borrow but that’s something that most people you speak to have difficulty accepting. That’s because they’ve only ever lived in a system where they spend their lives in the service of debt and cannot conceive of it being any different.

The bankers right at the top of the system aren’t out to grab hold of all the money in the world. They’re not after all the tangible wealth in the world either. Their only goal is to ensure that as much human labour as possible is in the service of debt.

Now for something different. How can this whole thing go horribly wrong for the bankers? I don’t just mean a run on banks or a recession. That happens periodically and is known as the business cycle. People lose confidence and are reluctant to borrow for a number of years, then they regain confidence and start to borrow again and the whole thing picks up and the cycle repeats.

What can go horribly wrong is if, after generations and generations and generations of increasing prices and debts, everyone gets more spooked by debt than ever before and totally fixated on repaying it. They sell assets but there are so many folk doing that that asset prices start to decline. That spooks people further. A spiral is under way. Banks try to ‘stimulate’ the economy by lowering interest rates but there is very little confidence around, especially if asset prices are declining compared with debts and wages aren’t rising either (or may be in decline), so that the ability to repay debt is impaired. This decline can be long and protracted. Also there can be many ups and downs along the way, although the long term trend is down. Ups can be deceptive as they are perceived as “coming out of the recession” by those used to the normal business cycles we’ve experienced throughout the whole of the twentieth century. In this way, asset prices can bleed away until eventually they reach something like a tenth of their peak value. This process can reach a very late stage before a lot of people recognise what’s really going on. This is just a scenario but one worth considering seriously. We could be in for long term deflation but it will be well under way and too late for many people in debt by the time it gets mainstream acknowledgement.

A closely-related question and one that automatically follows is why do countries bother having central banks at all? Instead of a government issuing bonds, why not directly issue the currency instead, thereby cutting out the middle men? It is an approach that actually has a number of historical precedents as pointed out in this open letter to Obama urging him to reissue ‘greenbacks’ and the campaign in Britain to print ‘treasury notes’ like the Bradbury Pound. So in a further reply to my friend I asked him, “do you think that the re-issuance of ‘greenbacks’ in America or the Bradbury Pound in the UK might offer a realistic solution to the current crisis?” His response:

The issue of greenbacks or whatever you call them (essentially government-issued money) would probably make no immediate difference. Already, the money created by quantitative easing is not working its way into the system, so why would money issued by any other means?

In the longer term, such a fundamental upheaval would make a huge difference as the government wouldn’t need to be in debt the whole time and people wouldn’t have to keep paying increasing prices for houses and cars on top of interest. Pensioners wouldn’t be on a treadmill, having to ‘invest’ their savings in a vain effort to keep up with inflation.

There’s a risk that the government might be tempted to print more and more money, which is often cited as a point in favour of the present system. It is claimed that having to pay interest and ultimately repay the whole principal is a disincentive in this respect. However, the current system ensures constant “printing” all the time as there’s no way that everyone involved can pay interest otherwise.

There’s talk at the moment about banks charging people a few percent for holding their money on deposit, i.e. “negative interest”. People think they’ll lose money as their account balances will go down over time. However, it’s no different to being paid, say, 6 percent interest at a time when inflation is at 9 percent and the cheapest loan you can get is 12 percent.
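The comparison can be made concrete with the standard Fisher relation between nominal and real interest rates. Here is a small Python sketch using the hypothetical percentages from the passage above (they are illustrative figures, not forecasts):

```python
# Real return on money via the Fisher relation: (1 + nominal)/(1 + inflation) - 1.
# All figures below are the hypothetical ones from the text.

def real_rate(nominal: float, inflation: float) -> float:
    """Exact real rate of return given a nominal rate and an inflation rate."""
    return (1 + nominal) / (1 + inflation) - 1

# "Negative interest" scenario: charged 3 percent while prices are flat.
print(round(real_rate(-0.03, 0.00), 4))  # → -0.03
# Conventional scenario: paid 6 percent while inflation runs at 9 percent.
print(round(real_rate(0.06, 0.09), 4))   # → -0.0275
```

Either way the depositor loses roughly three percent of purchasing power a year, which is the point being made: the sign on the nominal rate matters less than where it sits relative to inflation.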

I’m amazed at how people in the alternative media can inform us that banks are going to charge us ‘negative interest’ for our deposits, express outrage and then in the next breath claim that we’re in a hyperinflationary environment. Low/negative interest is a sure sign of massive deflationary pressure. I don’t know what’s going to happen but I’m convinced that deflation’s the one to watch. It has the potential to catch people out.

Getting back to your original question, the direct issuing of money by the government would represent a seismic shift of power from bankers to governments; a shift in the right direction, no doubt. It’s only possible if everyone knows exactly what’s going on. We’re a very long way off yet. People’s understanding of the banking scam is very, very poor.

I would add that very much front and centre in that scam is the role of the central banks: extraordinarily powerful commercial bodies that adopt the outward appearance of public institutions when in fact they work for commercial interests. The US Federal Reserve, for instance, is a de facto private corporation and all of its shareholders are private banks. The status of the Bank of England is more complicated. This is what the main Wikipedia entry intriguingly has to tell us:

Established in 1694, it is the second oldest central bank in the world, after the Sveriges Riksbank, and the world’s 8th oldest bank. It was established to act as the English Government’s banker, and is still the banker for HM Government. The Bank was privately owned [clarification needed (Privately owned by whom? See talk page.)] from its foundation in 1694 until nationalised in 1946.[3][4] 

Original references retained.

Clarification needed indeed! Anyway, nowadays it is officially (since 1998) an ‘independent public organisation’. However, the BoE is not really as independent as it might first appear, since along with eighteen other central banks from around the world (including the US Federal Reserve) it is a member of the executive of “the central bank for central banks” – the little-known Bank for International Settlements (BIS) based in Basel, Switzerland. To hear more about the history, ownership and function of this highly profitable (tax free and extraterritorial) organisation, I recommend listening to this interview with Adam LeBor, author of the recently released book The Tower of Basel:

For my own more detailed thoughts on effective remedies to the on-going financial crisis please read this earlier post.


Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text all newly incorporated text has been italicised.


1 From “The Future”, Essays in Persuasion (1931) Ch. 5, John Maynard Keynes, CW, IX, pp.329 — 331, Economic Possibilities for our Grandchildren (1930).

2 Adam Smith applied “the law of diminishing utility” to solve “the paradox of water and diamonds”. Water is a vital resource and most precious to life and yet it is far less expensive to purchase than diamonds, comparatively useless shiny crystals, which in his own times would have been used solely for ornamentation or engraving. The reason, Smith decides, is that water is readily abundant, such that any loss or gain is of little concern to most people in most places. By contrast, the rarity of diamonds means that, although less useful overall, any loss or gain of use is more significant, or to put it more formally the “marginal utility” is greater.

3 Extract taken from The soul of man under socialism by Oscar Wilde (first published 1891).

4 Legal tender is a technical legal term that basically means an offer of payment that cannot be refused in settlement of a debt.

5 Fiat (Latin), “let it be done” meaning that these currencies are guaranteed by government decree only.

6 Milton Friedman pays homage to Ronald Reagan’s record on deregulation in an essay entitled “Freedom’s friend” published in the Wall Street Journal on June 11, 2004. Drawing evidence from The Federal Register, “records the thousands of detailed rules and regulations that federal agencies churn out in the course of a year”, Friedman contrasts Reagan’s record with that of Presidential incumbents before and since: “They [the rules and regulations] are not laws and yet they have the effect of laws and like laws impose costs and restrain activities. Here too, the period before President Reagan was one of galloping socialism. The Reagan years were ones of retreating socialism, and the post-Reagan years, of creeping socialism.” For socialism read regulation.

7 Definition of “too big to fail” taken from “Idea that certain businesses are so important to the nation, that it would be disastrous if they were allowed to fail. This term is often applied to some of the nation’s largest banks, because if these banks were to fail, it could cause serious problems for the economy. By declaring a company too big to fail, however, it means that the government might be tempted to step in if this company gets into a bad situation, either due to problems within the company or problems from outside the company. While government bailouts or intervention might help the company survive, some opponents think that this is counterproductive, and simply helping a company that maybe should be allowed to fail. This concept was integral to the financial crisis of the late 2000s.”

8 According to IMF economic database for October 2010, World GDP is $61,963.429 billion (US dollars).

9 Unto This Last is based on a collection of four essays first published in the monthly Cornhill Magazine, 1860, and then reprinted as Unto This Last in 1862. This extract is drawn from his second essay: “The Veins of Wealth”

10 George Soros proudly explains the events of “Black Wednesday” on his official website: “In 1992, with the economy of the United Kingdom in recession, Quantum Fund’s managers anticipated that British authorities would be forced to break from the European Exchange Rate Mechanism (ERM) then in force and allow the British pound to devalue in relation to other currencies, in particular the German mark. Quantum Fund sold short (betting on a decline in value) more than $10 billion worth of pounds sterling. On September 16, 1992—later dubbed “Black Wednesday”—the British government abandoned the ERM and the pound was devalued by twenty percent.”

11 “Last year [1979] Bunker and his syndicate began buying silver again, this time on a truly gargantuan scale. They were soon imitated by other speculators shaken by international crises and distrustful of paper money. It was this that sent the price of silver from $6 per oz. in early 1979 to $50 per oz. in January of this year. Chairman Walter Hoving of Tiffany & Co., the famous jewelry store, was incensed. Tiffany ran an ad in the New York Times last week asserting: ‘We think it is unconscionable for anyone to hoard several billion, yes billion, dollars worth of silver and thus drive the price up so high that others must pay artificially high prices for articles made of silver from baby spoons to tea sets, as well as photographic film and other products.’” Extract taken from “He Has a Passion for Silver”, article published in Time Magazine, Monday 7 April, 1980.

12 “Many Government officials feared that if the Hunts were unable to meet all their debts, some Wall Street brokerage firms and some large banks might collapse.” Extract taken from “Bunker’s busted silver bubble”, article published in Time Magazine, Monday 12 May, 1980.

13 “What may deal the Hunt fortune a fatal blow is the fallout from the brothers’ role in the great silver-price boom and bust of 1980. Thousands of investors who lost money in the debacle are suing the Hunts. On Saturday the brothers lost a civil case that could set an ominous precedent. A six-member federal jury in New York City found that the Hunts conspired to corner the silver market, and held them liable to pay $63 million in damages to Minpeco, a Peruvian mineral-marketing company that suffered heavy losses in the silver crash. Under federal antitrust law, the penalty is automatically tripled to $189 million, but after subtractions for previous settlements with Minpeco, the total value of the judgment against the Hunts is $134 million.” Extract taken from “Big bill for a bullion binge”, article published in Time Magazine, Monday 29 August, 1988.

14 Extract also taken from the second essay, entitled: “The Veins of Wealth” of Unto This Last by John Ruskin.

15 Ibid.

Leave a comment

Filed under analysis & opinion, « finishing the rat race », financial derivatives, Max Keiser, neo-liberalism

the clouds of not knowing

The following article is Chapter Ten of a book entitled Finishing The Rat Race which I am posting, beginning today, chapter by chapter throughout this year. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from the category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.


“According to the postmodernists there is no such thing as absolute truth, so why should we believe them?”

question submitted to the regular Notes & Queries column in The Guardian.


Postmodernism is a slippery subject and one I’ve long endeavoured to get to grips with.

For a while I just tried asking dumb questions (applying a method of inquiry recommended by physicist Richard Feynman). “What exactly is postmodernism?” seemed like a good starter, although as I soon realised such a front-on assault wouldn’t get me very far. Quasi-mathematical answers floated back about ‘signs’ and ‘signifiers’ from the arcane sub-discipline of ‘semiotics’, or else esoteric reference to the foreign fields of ‘post-structuralism’ and ‘deconstructionism’. I also had to understand such important issues as ‘false consciousness’, ‘the death of the author’ and ‘the end of the grand narrative’. Slowly then, I learnt about this complex spaghetti of postmodernist theory, a theory more beloved by English Literature professors than readers of philosophy, yet a theory pushed by its outspoken advocates who regard it as the only rightful context for all other intellectual inquiry.


After years of discussion with defenders and proponents of postmodernist theory I have come to an understanding that there are basically two main strands often twisted into one. Here, however, I must confess that I find the majority of writings on postmodernist thinking to be dense, jargonistic and for the most part unintelligible, so I do not claim to be an expert by any means. But, in this regard I was very happy to discover that I was sat in the dunce’s corner with, amongst other dullards, that otherwise academically esteemed professor of linguistics, Noam Chomsky. Here’s what Chomsky has to say:

“Since no one has succeeded in showing me what I’m missing, we’re left with the second option: I’m just incapable of understanding. I’m certainly willing to grant that it may be true, though I’m afraid I’ll have to remain suspicious, for what seem good reasons. There are lots of things I don’t understand – say, the latest debates over whether neutrinos have mass or the way that Fermat’s last theorem was (apparently) proven recently. But from 50 years in this game, I have learned two things: (1) I can ask friends who work in these areas to explain it to me at a level that I can understand, and they can do so, without particular difficulty; (2) if I’m interested, I can proceed to learn more so that I will come to understand it. Now Derrida, Lacan, Lyotard, Kristeva, etc. – even Foucault, whom I knew and liked, and who was somewhat different from the rest – write things that I also don’t understand, but (1) and (2) don’t hold: no one who says they do understand can explain it to me and I haven’t a clue as to how to proceed to overcome my failures.

“I would simply suggest that you ask those who tell you about the wonders of “theory” and “philosophy” to justify their claims – to do what people in physics, math, biology, linguistics, and other fields are happy to do when someone asks them, seriously, what are the principles of their theories, on what evidence are they based, what do they explain that wasn’t already obvious, etc. These are fair requests for anyone to make. If they can’t be met, then I’d suggest recourse to Hume’s advice in similar circumstances: to the flames.”1


With this in mind, please allow me to unravel the two strands of postmodernism (as I find them).

i) postmodernism as a contemporary aesthetic.

On the one hand postmodernism promotes the idea of a new aesthetic. An aesthetic born from the ashes of modernism that it usurped. The fall of religion, of classical physics, as well as of other established and seemingly apodictic systems, had sparked a fin de siècle revolution around the turn of the twentieth century, and in consequence, artists looked for new modes of expression. The aftermath of two world wars heightened this need for a new awakening. One artistic response has been to recognise that the loss of a grounding on the basis of some kind of universal referent is intractable, and thus to turn inwards. To search for inspiration in the exploration of relationships between the artist and the subjective unreliability of their own account. To elevate context above meaning, subtext above text, and to make style and form themselves, the primary subjects of the artist.

Now I think that this is a perfectly reasonable place for artists to go. Artists after all are free to go as and where they choose (as are all citizens in any healthy political climate). Within the bounds of legality and, aside from the important issue of earning a living wage, artists are bounded only by the development of their creative and imaginative faculties. Choosing to explore the world as they find it (in realism), or of their own emotions (Romanticism), or what is discovered in the unconscious (surrealism), or even ideas in and of themselves (conceptualism) is therefore a matter wholly at the discretion of the artist. Whether they take on board styles from the past or other cultures, manipulate and meld them into a new eclecticism, or else, like Duchamp, point with irony at the question of what is art itself, then good for them. And if this is the current fashion, then so be it. Whether or not these pursuits are deemed in any way successful will be judged both here and in the future, as always. Fashions in every field coming and going as they do. All of this I accept.

Now if this is all postmodernism ever had to say, then let it be said, but let it also be said that there is nothing particularly ‘modern’ about it, let alone ‘post’…

Shakespeare made many allusions to the theatre itself, and liked to include plays within his plays, shifting the audience’s perspective with reminders that we are another part of a performance, long before Bertolt Brecht had snapped his fingers to wake us to our own participation. Laurence Sterne’s Tristram Shandy, one of the earliest novels in the English language, is a work famous and celebrated for being so self-referential. More recently, René Magritte’s paintings challenge relationships between images, words and the world; whilst in early cartoons we can also find such ‘postmodern’ devices, as, for example, when Bugs Bunny becomes Daffy’s animator in the splendid Duck Amuck. Such is the success of these games of form and reference within purely comedic settings that even that most hackneyed of old jokes “why did the chicken cross the road?” relies on an audience who understands its cultural reference to jokes more generally – that jokes have a punchline, and so the joke here is that there isn’t one. Context has become everything, and what could be more ‘postmodern’ than that?

ii) postmodernism as a theory against absolutes

My first brush with postmodernism happened almost two decades ago when, as a postgraduate student, I’d suddenly begun to mix within altogether more literary circles. During my three years of studying physics in London I’d never once encountered any reference to the ideas of Saussure, Derrida, Lacan, Foucault or Baudrillard, but suddenly I had a few English post-grads telling me that physics, and indeed science in general, was just another theory, and one holding no greater claim to an understanding of nature than any other. At first this seemed hilarious. How, I wondered, could those who knew next to nothing with regards to, say, Newton’s laws of motion, be so smug in their opinions about the truth or otherwise of quantum mechanics and relativity? Studying science had at least taught me not to be so presumptuous. So just what had gotten into them?

Jacques Derrida2 famously wrote that “there is nothing outside the text”, which is an extraordinary thing to write when you think about it. I mean is Derrida quite literally saying that nothing exists beyond the text? Why of course not, you dingo! For if nothing existed beyond the text, then there couldn’t be any text, since there’d be no one to write it in the first instance. Surely that’s obvious enough! So what does he mean?

In my handy guide Postmodernism for Beginners3, which at least has the good grace to include plenty of nice pictures, there is a section entitled ‘Deconstruction’, which was (according to the book) Derrida’s method for waging “a one-man ‘deconstructionist’ war against the entire Western tradition of rationalist thought.” His new approach of deconstruction, the book goes on to say, being an attempt “to peel away like an onion the layers of constructed meaning.” But of course if you peel away the layers of a real onion you’re eventually left with nothing… which is something the book’s analogy fails to mention.

And just what is Derrida’s method of deconstruction? An attempt to look for meanings in the text that were “suppressed or assumed in order for it to take its actual form”. I’m quoting from my book again. But then how is anyone supposed to do this? Well, here again I confess that I really don’t know – and the book is only a beginners’ guide so unfortunately it doesn’t say. I can however recall the story told by a friend who was studying for a degree in English Literature. He told me that his tutor had once asked a seminar group to read a selected text with the express intention of misunderstanding the author. So I guess that’s one approach.4

Now I concede that all critical readers must have due entitlement to read between the author’s lines. Anyone with a modicum of sense must recognise that an artist will at times disguise their true intentions (especially if they involve dangerous political or religious dissent); dressing their concealed truths in fitting uniforms. Of course the author may also wish to veil themselves for altogether more personal or private reasons. But then why privilege the latent above the blatant anyway? As if what an author tries to hide is more important than what they are, more directly, seeming to say. To address this question, postmodernists broaden their case, saying that ‘meaning’ itself is wholly dependent upon ‘authority’ or ‘power’. This is to say that the artist is nothing more than a product of the cultural context of his or her time. According to such reasoning, whatever it was they’d meant to say becomes irrelevant. A depressing claim, and one that lacks any obvious foundation. And where is the broader point to all of this? What does it have to do with science for instance?

Well, Derrida contends that the word ‘text’ must be understood in “the semiological sense of extended discourses.” Any clearer? No – try this: “all practices of interpretation which include, but are not limited to, language.” Got it yet? I’ll put it more picturesquely. Away from the leafy seclusion of literature departments, Derrida is declaring that this same approach (his approach) must be applied to all avenues of thinking. Any special privilege for methods of reason and objectivity is to be absolutely refused on grounds that once we are agreed that all discourse (in the semiological sense) is necessarily a cultural, historical or linguistic construct, then all ideas must be seen to hold the same indeterminate value. Therefore, to raise science above other disciplines of enquiry is merely “a value judgement” borne of European prejudice and vanity.

So what finally does this all amount to? Does Derrida really claim that astronomy can be judged to be no better measure of our universe than astrology? Or that when Galileo proposed the idea that the earth moved around the Sun, the pope was no less right for saying that it did not? Or if we proclaim that the world is round, are we no closer to any kind of truth than the legendary flat-earthers? And when we build rockets that fly to the moon and beyond, that this does not prove Newton’s ideas over those of Aristotle? The same Aristotle who thought that the moon was made not of rock, since rock would inevitably crash to earth, but from a fabulous unearthly material called quintessence! And what if Jacques Derrida were to have taken some leap of faith from his window, might he have hovered in the air like Road Runner, or would he more surely have accelerated toward the ground at 9.81 metres per second per second? I certainly know where my money’s riding.


Now in case you think my objections are unfounded, and based on either my lack of knowledge of the subject or else a deliberate and calculated misinterpretation of postmodernist thinking (whatever that means given the postmodernists’ own refusal to privilege an author’s intentions on the grounds that these are unrecoverable and irrelevant), I feel that I must draw attention to an incident now referred to as The Sokal Affair.

In 1996, Alan Sokal, a professor of physics at New York University, feeling frustrated by the nihilistic claims being made by the postmodernists, decided (as any good scientist would) to perform an experiment. His hypothesis (if you like) being that he could convince a reputable journal in the field to: “publish an article liberally salted with nonsense if (a) it sounded good and (b) it flattered the editors’ ideological preconceptions.” On this basis he submitted a paper entitled “Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity” to the journal Social Text. To give you a flavour of Sokal’s admirable hoax, here is an extract from that paper:

“Derrida’s perceptive reply went to the heart of classical general relativity: The Einsteinian constant is not a constant, is not a center. It is the very concept of variability – it is, finally, the concept of the game. In other words, it is not the concept of something – of a center starting from which an observer could master the field – but the very concept of the game… “

Outlandish nonsense, of course, but (and no doubt to Sokal’s great delight) the journal mistook his fun for a work worthy of publication5. Then, on the same day of its publication, Sokal announced his hoax in a different journal, Lingua Franca, calling his published paper “a pastiche of left-wing cant, fawning references, grandiose quotations, and outright nonsense”, which was “structured around the silliest quotations I could find about mathematics and physics”6. Here is what Sokal himself had to say about his reasons for perpetrating the hoax and his underlying concerns regarding the influence of the Social Text editors. He has a great deal to say and so I feel it is fitting to give over the remainder of this section to Sokal’s own justification and conclusions (after all, why have a dog and bark yourself):

“Of course, I’m not oblivious to the ethical issues involved in my rather unorthodox experiment. Professional communities operate largely on trust; deception undercuts that trust. But it is important to understand exactly what I did. My article is a theoretical essay based entirely on publicly available sources, all of which I have meticulously footnoted. All works cited are real, and all quotations are rigorously accurate; none are invented. Now, it’s true that the author doesn’t believe his own argument. But why should that matter? … If the Social Text editors find my arguments convincing, then why should they be disconcerted simply because I don’t? Or are they more deferent to the so-called “cultural authority of technoscience” than they would care to admit? […]

“The fundamental silliness of my article lies, however, not in its numerous solecisms but in the dubiousness of its central thesis and of the “reasoning” adduced to support it. Basically, I claim that quantum gravity — the still-speculative theory of space and time on scales of a millionth of a billionth of a billionth of a billionth of a centimeter – has profound political implications (which, of course, are “progressive”). In support of this improbable proposition, I proceed as follows: First, I quote some controversial philosophical pronouncements of Heisenberg and Bohr, and assert (without argument) that quantum physics is profoundly consonant with “postmodernist epistemology.” Next, I assemble a pastiche – Derrida and general relativity, Lacan and topology, Irigaray and quantum gravity – held together by vague rhetoric about “nonlinearity”, “flux” and “interconnectedness.” Finally, I jump (again without argument) to the assertion that “postmodern science” has abolished the concept of objective reality. Nowhere in all of this is there anything resembling a logical sequence of thought; one finds only citations of authority, plays on words, strained analogies, and bald assertions.7

“Why did I do it? While my method was satirical, my motivation is utterly serious. What concerns me is the proliferation, not just of nonsense and sloppy thinking per se, but of a particular kind of nonsense and sloppy thinking: one that denies the existence of objective realities, or (when challenged) admits their existence but downplays their practical relevance. …

“In short, my concern over the spread of subjectivist thinking is both intellectual and political. Intellectually, the problem with such doctrines is that they are false (when not simply meaningless). There is a real world; its properties are not merely social constructions; facts and evidence do matter. What sane person would contend otherwise? …

“Social Text’s acceptance of my article exemplifies the intellectual arrogance of Theory – meaning postmodernist literary theory – carried to its logical extreme. No wonder they didn’t bother to consult a physicist. If all is discourse and “text,” then knowledge of the real world is superfluous; even physics becomes just another branch of Cultural Studies. If, moreover, all is rhetoric and “language games,” then internal logical consistency is superfluous too: a patina of theoretical sophistication serves equally well. Incomprehensibility becomes a virtue; allusions, metaphors and puns substitute for evidence and logic. My own article is, if anything, an extremely modest example of this well-established genre. …

“Politically, I’m angered because most (though not all) of this silliness is emanating from the self-proclaimed Left. We’re witnessing here a profound historical volte-face. For most of the past two centuries, the Left has been identified with science and against obscurantism; we have believed that rational thought and the fearless analysis of objective reality (both natural and social) are incisive tools for combating the mystifications promoted by the powerful – not to mention being desirable human ends in their own right. The recent turn of many “progressive” or “leftist” academic humanists and social scientists toward one or another form of epistemic relativism betrays this worthy heritage…

“I say this not in glee but in sadness. After all, I’m a leftist too (under the Sandinista government I taught mathematics at the National University of Nicaragua)… But I’m a leftist (and feminist) because of evidence and logic, not in spite of it.”8


It has long puzzled me too, why many once dyed-in-the-wool Marxists have increasingly drifted over to Derrida. I mean these two systems are supposedly in direct contradiction. Marxism is a ‘grand metanarrative’ par excellence, and so postmodernism is presumably its willing nemesis. So why would those who had invested so heavily in Marx suddenly jump into bed with Derrida et al? Well, it might be supposed that the fall of the Berlin Wall was of key importance here.

With the end of the Soviet experiment, it wasn’t simply a political regime that had given way. In its wake the whole Marxist ideology was rocked since, whatever its adherents may then have believed, this rapid and extraordinary sequence of events signified the catastrophic end of that particular alternative world vision.9

It’s not even that Marxists were still looking longingly toward Russia for their answers – most had already long accepted that the Soviet dream died with Stalin if not before – but just as with the death of a friend, it’s not until the funeral that we can finally say farewell. For those who’d searched for answers under the lens of Marxism, a time was rapidly approaching when most would be forced to admit defeat. That finally there was nothing left to halt the rising tide of global capitalism. Unless…

But lo! Could some new theory, of revolutionary hue, if significantly altered, replace the discarded doctrines of Marxism? Perhaps there was still something yet that might save the world from the savagery of unchallenged global capitalism. Soon these were the hard questions facing not only the Marxists but all those with Socialist leanings. And as a Leftist too, I shared in the same concerns.

Not that Marxism is dead of course. Not quite. Though Marx appears to be a spent political force, his spell, if diminished, is still potent inside the faculties of academia, living on in the alcoves of English departments for instance (and often side by side with Derrida and the others). But my question is how did Derrida step into Marx’s boots so comfortably? Is there any deeper reason why Marx and Derrida have made such good bedfellows? Is there anything that these adversaries might actually share?


I recently came across a review of philosopher Daniel Dennett’s book Breaking the Spell – his inquiry into the origins of religion (a popular subject these days) – and have since been considering whether or not to include any mention of it (perhaps with reference to my thoughts in Chapter One). As you will know, presuming you’ve read everything thus far, I have so far avoided making any direct reference to Dennett’s book as such. Instead, and by way of a brief and hopefully interesting digression, I have decided to present a review of the review itself. Quite aside from being in keeping to offer such a meta-narrative, the review itself, which happened to feature on a website otherwise dedicated to “world socialism”, helped to shed light on the current theme of the odd convergence between postmodernist theory and Marxism. But before I can progress, I first need to outline briefly the main thrust of Dennett’s book itself, which, stated most succinctly, is that religion is a natural phenomenon.

There is an evolutionary advantage, Dennett says in Breaking the Spell, conferred on those who adopt “the intentional stance”: our very reasonable presumption that the other creatures we encounter are also “agents”. By extension, Dennett continues, it is easy to understand why natural forces in general might also be presumed to act rationally and with specific desires in mind.

Combined with this, as Dennett also points out, the offspring of many species, including humans, are innately trusting toward their parents, because, happily, this too confers a survival advantage. Taking these factors together, it is easy to understand how a worship of ancestors might have arisen as a useful by-product of human evolution. Meanwhile, on the cultural level, as the earlier hunter-gatherer communities gave way to agricultural settlement, the way opened for more formalised and stratified forms of religion slowly to arise. Religion then, according to Dennett, is a piece, if you like, of mankind’s extended phenotype (yet another natural/cultural artefact, and, as such, somewhat akin to the motor car or the Aswan Dam, none of which are any less “natural” than, say, a bird’s nest or a beaver’s lodge). And thus, being natural in origin, religion itself becomes a proper subject for scientific investigation, just as all other natural phenomena lie within the province of scientific analysis.

The spell that Dennett finally wishes us to break is the conviction that religion stands apart from scrutiny: in truth, he argues, it is fundamentally no different from any other kind of human behaviour or enterprise. That much is all Dennett – at least according to our reviewer.

Dennett’s approach is not really to my taste. It leans too heavily on the speculative theories of evolutionary psychology, and in doing so stretches the concept of “natural” to such a degree as to render the word close to meaningless. But worse than that, he leaves little or no room for the insoluble cosmic riddle itself, when this is surely a vital component in any proper understanding of what drives the religious impulse. So this is my review, second-hand of course (since I am not intrigued enough to read Dennett’s original words).

Firstly, our reviewer acknowledges that much of the book is admirable, in so far as it goes, but then he insists that Dennett misses the main point. And the main point? Well, from the reviewer’s perspective Dennett simply isn’t being Marxist enough. Remember, this is a Marxist review!

In order to grasp the infernal bull of religion properly by the horns you need to understand Marx, the reviewer goes on. Why? Because Marx recognised how religion retards “class consciousness” amongst the proletariat, famously calling it “the opium of the masses” and “the sigh of the oppressed”. Religion then, according to Marx, is a comforting but ultimately false light: its promises of heavenly paradise a necessary distraction from the injustices of the real world. At root, it is a necessary means of mollifying the proletarian masses. And who can doubt how often religion has served and still serves precisely such ends – although we didn’t actually need Marx to tell us so. Thinkers back to Voltaire (and long before him) have repeatedly proffered that same opinion.10 Which is where I’ll finally come back to postmodernism, deconstruction and Derrida.

Here’s the actual sentence in the review that snagged my attention, causing me to make a connection that had perhaps been obvious all along:

“[But] Marxism does recognize that material factors are ultimately to be found at the root of all ideology, of which religion is a part.”11 (Emphasis added.)

Soon afterwards the reviewer backs this same assertion with a quote taken directly from Engels:

“Still higher ideologies, that is, such as are still further removed from the material, economic basis, take the form of philosophy and religion. Here the interconnection between conceptions and their material conditions of existence becomes more and more complicated, more and more obscured by intermediate links. But the interconnection exists.”12

Suddenly, it can all be fitted together. For the Marxists too, not just religion but all “higher ideologies” might be whittled back to their cultural and historical constructs. A deconstruction almost worthy of Derrida, the difference lying in the placement of emphasis: for Engels the cultural and historic conditions are “material”, whereas for Derrida they are “semiotic” – whatever that exactly means.

Marxism is entirely a Capitalist heresy, said the late political satirist Gore Vidal, adding that Capitalism was itself a Christian heresy. Not that these ideologies are in essence one and the same, any more than it automatically follows that, since a frog develops from a tadpole, both creatures are inherently identical and indistinguishable. Vidal’s point is simply that these three mutually antagonistic doctrines, Christianity, Capitalism and Marxism, are closely related by origins.

Following on then, postmodernism ought to be understood as a Marxist heresy, and thus, by extension, just another in a line of Christian heresies. It is, to extend Gore Vidal’s insightful analysis, a cousin of Christianity twice-removed. Or look at it this way: when Derrida says, “there is nothing outside the text”, is he saying anything so radically different from “The Word is God”? The circle, it seems, is complete.


But I cannot finish the chapter here. For though it is certainly fair to draw comparisons between the “social constructs” of postmodernism and the “false consciousness” of Marx, it is unfair to judge them as equals. Marx never denied the possibility of “true consciousness”, since this is, broadly speaking, his goal. Derrida’s approach is altogether foggier, rejoicing as it does in the rejection of all “logocentric” reason. So determined is he to escape from every possible kind of absolutism, the dangers of which are evident enough, that he finally leads himself and his followers into the shifting sands of relativism. Once there, afraid to face up to truth in any shape, he thinly veils this nihilism with obscurantism and sophistry.

In 1966, when Jacques Derrida met Paul De Man they quickly became friends and colleagues. Independently and together, they continued to develop their theories of deconstruction. However, you won’t find any reference to Paul De Man in my Postmodernism for Beginners guide, because in recent years De Man has slipped a little off the pages. Why is this? Perhaps because after his death, evidence came to light that during the war he had been an active promoter of Nazism.

Some articles penned for the Belgian collaborationist newspaper, Le Soir, during the first years of the war, had indeed been explicitly antisemitic, referring to the “Jewish problem” and how it was “polluting” the contemporary culture. More shockingly, De Man had continued producing his albeit modest contribution to the Nazi propaganda machine, when he must surely have known that a genocide was taking place on his doorstep. In the wake of the first expulsion of Belgian Jews, as thousands were crushed into the cattle wagons, and driven from homes in Brussels to the horrors of Auschwitz, De Man had continued to peddle such poisonous nonsense. When news of De Man’s Nazi sympathies first came out, this story actually made the front page of the New York Times, generating a furore that seems a little surprising today. It provides a measure of how much De Man’s star has faded.

But then, in the aftermath of such shocking revelations, Derrida defended his old friend – as well as the reputation of their shared child: deconstruction. Aside from the appeals to justice and fairness, Derrida made use of his own deconstructive methods in articles such as the poetically titled “Like the sound of the sea deep within a shell: Paul De Man’s war” and then (in response to further criticism) “Biodegradables: Six Literary Fragments”. De Man must be understood within his cultural context, Derrida insisted throughout13.

In later years, Derrida quietly admitted that some texts (and ideologies) were more equal than others, even attesting to a Marxist element within his own branch of deconstruction (at least if Postmodernism for Beginners is to be believed). Whatever the case, in his defence of De Man, Derrida clearly understood how his slippery theory might profitably be used to paint black as grey and grey as white.14

It was precisely this same lurking danger that George Orwell had understood so well, and which he laid out so clearly within the covers of Nineteen Eighty-Four:

“The Party told you to reject the evidence of your eyes and ears. It was their final, most essential command. His [Winston Smith’s] heart sank as he thought of the enormous power arrayed against him, the ease with which any Party intellectual would overthrow him in debate, the subtle arguments which he would not be able to understand, much less answer. And yet he was in the right! They were wrong and he was right. The obvious, the silly, and the true had got to be defended. Truisms are true, hold on to that! The solid world exists, its laws do not change. Stones are hard, water is wet, objects unsupported fall towards the earth’s centre. With the feeling that he was speaking to O’Brien [an Inner Party official], and also that he was setting forth an important axiom, he wrote [in his secret diary]:

‘Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.’”15


So much for the murk of postmodern unknowing. There are other ways to challenge logocentrism – that pursuit of certainty through reason that Derrida so detested. So I’d like to finish this chapter by dispelling the Occidental mists a little with thoughts from abroad.

The teachers of Ch’an or Zen Buddhism from centuries past also impressed upon their students that proper understanding cannot be grasped by the indelicate gloves of verbal or logical reasoning. However, in contrast to Derrida and the others, they did not confuse reason with objectivity.

One such teacher, Dofuku said: “In my opinion, truth is beyond affirmation or negation, for this is the way it moves.” Here then, to finish, a few alternative words on the complex relationship between language and the world. The first of these are lines taken from the Chinese tradition of Ch’an, from a collection written down in the thirteenth century16:

Words cannot describe everything.

The heart’s message cannot be delivered in words.

If one receives words literally, he will be lost.

If he tries to explain with words, he will not awaken to the world.

And here, a later Japanese Zen story called “Nothing exists”17 that cautions the student against the ever-fatal error of “mistaking the pointing finger for the Moon” by confusing any description of reality with reality itself:

Yamaoka Tesshu, as a young student of Zen, visited one master after another. He called upon Dokuon of Shokoku.

Desiring to show his attainment, he said: “The mind, Buddha, and sentient beings, after all, do not exist. The true nature of phenomena is emptiness. There is no realisation, no delusion, no sage, no mediocrity. There is no giving and nothing to be received.”

Dokuon, who had been smoking quietly, said nothing. Suddenly he whacked Yamaoka with his bamboo pipe. This made the youth quite angry.

“If nothing exists,” inquired Dokuon, “where did this anger come from?”


1 BEYOND NATIONS & NATIONALISMS: One World, Noam Chomsky on Post Modernism and Activism

From a discussion that took place on LBBS, Z-Magazine‘s Left On-Line Bulletin Board, 1997.

2 “So take Derrida, one of the grand old men. I thought I ought to at least be able to understand his Grammatology, so tried to read it. I could make out some of it, for example, the critical analysis of classical texts that I knew very well and had written about years before. I found the scholarship appalling, based on pathetic misreading; and the argument, such as it was, failed to come close to the kinds of standards I’ve been familiar with since virtually childhood.” Ibid.

3 All quotations without footnotes in this section are drawn from “Postmodernism for Beginners” by Richard Appignanesi and Chris Garratt, Icon Books Ltd. Whether or not these are the words of Jacques Derrida is not always made clear, but then why should we worry about authorship when, as Barthes pointed out: “readers create their own meanings, regardless of the author’s intentions: the texts they use to do so are thus ever-shifting, unstable and open to question.” (p.74)

4 “As for the “deconstruction” that is carried out… I can’t comment, because most of it seems to me gibberish. But if this is just another sign of my incapacity to recognize profundities, the course to follow is clear: just restate the results to me in plain words that I can understand, and show why they are different from, or better than, what others had been doing long before and have continued to do since without three-syllable words, incoherent sentences, inflated rhetoric that (to me, at least) is largely meaningless, etc. That will cure my deficiencies – of course, if they are curable; maybe they aren’t, a possibility to which I’ll return.” Noam Chomsky, source as above.

5 Published in Social Text #46/47 (spring/summer 1996) pp. 217-252. Duke University Press.

6 Sokal, Alan (May 1996). A Physicist Experiments With Cultural Studies. Lingua Franca.

7 He adds here that: “It’s understandable that the editors of Social Text were unable to evaluate critically the technical aspects of my article (which is exactly why they should have consulted a scientist). What’s more surprising is how readily they accepted my implication that the search for truth in science must be subordinated to a political agenda, and how oblivious they were to the article’s overall illogic.” Ibid.

8 For publishing Sokal’s original paper, the journal Social Text received the Ig Nobel Prize for literature (1996).

9 “The fall of the Berlin Wall did more than any of the books that I, or anybody else, has written, to persuade people that that was not the way to run an economy.” Quote from the free-market economist Milton Friedman.

10 Voltaire, who was an outspoken critic of religious and, in particular, Catholic fanaticism, clearly understood and bravely acknowledged the relationship between church authority and political power more generally. In his Dictionnaire philosophique (1764), the main target of which is the Christian church, and its doctrinal belief in the supernatural, he wrote dryly: “As you know, the Inquisition is an admirable and wholly Christian invention to make the pope and the monks more powerful and turn a whole kingdom into hypocrites.”

11 “Dennett’s dangerous idea”: a review written by James Brookfield (6 November 2006) of Breaking the Spell: Religion as a Natural Phenomenon, by Daniel Dennett, Viking Adult, 2006. Review taken from the World Socialist Web Site, published by the International Committee of the Fourth International (ICFI).

12 Ludwig Feuerbach and the End of Classical German Philosophy, Part 4: Marx, by Friedrich Engels, First Published: 1886, in Die Neue Zeit, and translated by Progress Publishers in 1946.

13 “First, Derrida argues, de Man is not responsible for all of the many evils of Nazism or for the Holocaust. To compare him to Mengele, as one writer did, is unjust. Second, it is unjust to read de Man’s later writings as an admission of guilt or responsibility – or as an attempt to deny responsibility – for what he did during World War II. Third, although de Man wrote a series of articles expressing the ideology of the occupation forces and one article which is blatantly antisemitic, it is unjust to judge his whole life based on that one episode in his youth. Fourth – and this is the most controversial point in his argument – Derrida suggests that de Man’s articles are not as damning as one might be led to expect when they are read in the appropriate context. According to Derrida, the explicit antisemitism of the worst article is equivocal, and it is hardly as bad as many other articles in Le Soir. …”

“Nor can one object that these two articles do not discuss deconstruction or employ deconstructive techniques. In fact, both possess interesting and sustained discussions of deconstruction and its place in the academy, as well as many passages explicitly offering and rejecting possible connections between deconstruction and justice, or between deconstruction on the one hand and fascism or totalitarianism on the other.” Passages taken from Transcendental Deconstruction, Transcendent Justice, originally published in Mich. L. Rev. 1131 (1994) by Jack M. Balkin.

14 Jack Balkin, respected academic and defender of deconstructionism, acknowledges the dangers of following its relativistic course when it leads toward nihilism. He explains how Derrida betrays his own theory to avoid this error: “[First] Derrida offers deconstructive arguments that cut both ways: Although one can use deconstructive arguments to further what Derrida believes is just, one can also deconstruct in a different way to reach conclusions he would probably find very unjust. One can also question his careful choice of targets of deconstruction: One could just as easily have chosen different targets and, by deconstructing them, reach conclusions that he would find abhorrent. Thus, in each case, what makes Derrida’s deconstructive argument an argument for justice is not its use of deconstruction, but the selection of the particular text or concept to deconstruct and the way in which the particular deconstructive argument is wielded. I shall argue that Derrida’s encounter with justice really shows that deconstructive argument is a species of rhetoric, which can be used for different purposes depending upon the moral and political commitments of the deconstructor.”

This perfidy, Balkin celebrates, suggesting that Derrida’s new form of “transcendental deconstruction” be universally adopted: “Yet, in rising to respond to these critics, just as he had previously responded to the critics of de Man, Derrida offered examples of deconstructive argument that were not wholly consistent with all of his previous deconstructive writings. They are, however, consistent with the practice of deconstruction that I have advocated. This is Derrida’s perfidy, his betrayal of deconstruction. Yet it is a betrayal that I heartily endorse. …”

15 Quote taken from Nineteen Eighty-Four, Part 1, Chapter 7.

16 Ibid, p.123. Extract taken from The Gateless Gate by Ekai, called Mumon. Transcribed by Nyogen Senzaki and Paul Reps. [I have modified the final line to render a more poetic effect. The original reads: “If he tries to explain with words, he will not attain enlightenment in this life.” In making this small alteration I have tried to maintain the spirit of the original.]

17 Extract taken from Zen Flesh, Zen Bones, an anthology of Zen and pre-Zen writing compiled by Paul Reps, published by Penguin Books, reprinted in 2000, p.75.

1 Comment

Filed under analysis & opinion, « finishing the rat race », Noam Chomsky