
the life lepidopteran

The following article is an Interlude between Parts I and II of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from the category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.

*

“Once upon a time, I, Chuang Chou, dreamt I was a butterfly, fluttering hither and thither, to all intents and purposes a butterfly. I was conscious only of my happiness as a butterfly, unaware that I was Chou. Soon I awaked, and there I was, veritably myself again. Now I do not know whether I was then a man dreaming I was a butterfly, or whether I am now a butterfly, dreaming I am a man.”

— Chuang Tzu 1

*

Before proceeding further, I’d like to tell a joke:

A man walks into a doctor’s.

“Doctor, Doctor, I keep thinking I’m a moth,” the man says.

The doctor gives him a serious look. “Sorry, but I am not strictly qualified to help you,” he replies, rubbing his chin earnestly before adding after a momentary pause, “You really need to see a psychiatrist.”

“Yes,” says the man, “but your light was on.”

*

There can be no doubting that each of us acts to a considerable extent in accordance with mental processes that lie largely beyond, and are often alien to, our immediate conscious awareness and understanding. For instance, in general we draw breath without the least consideration, or raise an arm, perhaps to scratch ourselves, with scarcely a thought and zero comprehension of how we actually moved our hand and fingers to accomplish the act. And this everyday fact becomes more startling once we consider how even complex movements and sophisticated patterns of behaviour seem to originate without full conscious direction or awareness.

Consider walking for instance. After admittedly painstaking practice as infants, we soon become able to walk without ever thinking to swing our legs. Likewise, if we have learnt to drive, eventually we are able to manoeuvre a large vehicle with hardly more conscious effort than we apply to walking. The same is true of most daily tasks, which are performed no less thoughtlessly and which, in spite of their intricacies, we often find boring and mundane. For instance, those who have been smokers may be able to perform the rather complicated act of rolling a cigarette without pausing from conversation. Indeed, deep contemplation will probably leave us more bewildered than anything by the mysterious coordinated manipulation of all eight fingers and opposing thumbs.

Stranger still is that our ordinary conversational speech proceeds before we have formed the fully conscious intent to utter our actual words! When I first heard this claim, it struck me as so unsettling that I automatically rejected it outright in what perhaps ought to be called a tongue-jerk reaction. (Not long afterwards I was drunk enough to stop worrying about the latent implications!) For considered dispassionately, it is self-evident that there simply isn’t sufficient time to construct each and every utterance consciously and in advance of the act of speaking; so our vocal ejaculations (as they once were unashamedly called) are just that – they are thrown out! Still further proof is provided by instances when gestures or words emerge in direct conflict with our expressed beliefs and ideas. Those embarrassing occasions when we blurt out what we know must never be spoken we call Freudian slips (and more on Freud below).

More positively, and especially when we enter ‘the zone’, each of us is able to accomplish complex physical acts – for instance, throwing, catching, or kicking a ball – before any conscious thought arises to do so. Indeed, anyone who has played a sport long enough will recall joyous moments when they have marvelled not only at their own impossible spontaneity, but at the accompanying accuracy, deftness, nimbleness, and on very rare occasions even enhanced physical strength. Likewise, urges, feelings, fears and sometimes the most profound insights can suddenly spring forth into “the back of our minds”, as if from nowhere. As a consequence, this apparent nowhere was eventually given a name. It has come to be known as “the preconscious”, “the subconscious” and, latterly, “the unconscious”.

What this means, of course, is that “I” am not what I ordinarily think I am, but in actuality a lesser aspect of a greater being who enjoys remarkable and considerable talents and abilities beyond “my own” since they lie outside “my” immediate grasp. In this way, we all have hidden depths that ought to give rise to astonishment, although for peculiar reasons of pride, instead we tend to feign ignorance of this everyday fact.

*

The person most popularly associated with the study of the human unconscious is Sigmund Freud, a pioneer in the field but by no means its discoverer. In fact, the philosopher and all-round genius Gottfried Leibniz has a prior claim to the discovery, having suggested that our conscious awareness may be influenced by “insensible stimuli”, which he called petites perceptions 2; while another giant of German philosophy, Immanuel Kant, subsequently proposed the existence of ideas of which we are not fully aware, while admitting the apparent contradiction inherent in such a conjecture:

“To have ideas, and yet not be conscious of them, — there seems to be a contradiction in that; for how can we know that we have them, if we are not conscious of them? Nevertheless, we may become aware indirectly that we have an idea, although we be not directly cognizant of the same.” 3

Nor is it the case that Freud was first in attempting any kind of formal analysis of the make-up and workings of the human psyche. Already in 1890, William James had published his ground-breaking work The Principles of Psychology, and though James was keen to explore and outline his principles for human psychology by “the description and explanation of states of consciousness”, rather than to plunge more deeply into the unknown, he was also fully aware of the potentiality of unconscious forces and made clear that any “‘explanation’ [of consciousness] must of course include the study of their causes, conditions and immediate consequences, so far as these can be ascertained.” 4

*

William James’s own story is both interesting and instructive. As a young man he had been at somewhat of a loss to decide what to do with himself. Having briefly trained as an artist, he quickly realised that he’d never be good enough and became disillusioned with the idea, declaring that “there is nothing on earth more deplorable than a bad artist”. He afterwards retrained in chemistry, enrolling at Harvard in 1861 (a few months after the outbreak of the American Civil War), but, restless again, he transferred to biology twelve months or so later. Still only twenty-one, James already felt that he was running out of options, writing in a letter to his cousin:

“I have four alternatives: Natural History, Medicine, Printing, Beggary. Much may be said in favour of each. I have named them in the ascending order of their pecuniary invitingness. After all, the great problem of life seems to be how to keep body and soul together, and I have to consider lucre. To study natural science, I know I should like, but the prospect of supporting a family on $600 a year is not one of those rosy dreams of the future with which the young are said to be haunted. Medicine would pay, and I should still be dealing with subjects which interest me – but how much drudgery and of what an unpleasant kind is there!”

Three years on, James entered the Harvard Medical School, where he quickly became disillusioned again. Certain that he no longer wished to become a practicing doctor, and more interested in psychology and natural history than medicine, he seized upon a new opportunity and set sail for the Amazon in hopes of becoming a naturalist. However, the expedition didn’t work out well either. Fed up with collecting bugs and bored with the company of his fellow explorers, to cap everything he fell quite ill. Although desperate to return home, he was obliged to continue, and slowly he regained his strength, deciding that in spite of everything it had been a worthwhile diversion; no doubt heartened too by the prospect of finally returning home.

It was 1866 when James next resumed medical studies at Harvard, although the Amazon adventure had left him physically and (very probably) psychologically weakened; a continuing sickness forced James to break off from his studies yet again. Seeking rest and recuperation, for the next two years James sojourned in Europe, where, to judge from his own accounts, he again experienced a great deal of isolation, loneliness and boredom. Returning to America at the end of 1868 – now approaching twenty-seven years old – he picked up his studies at Harvard for the last time, successfully passing his degree to become William James M.D. in 1869.

Too weak to find work anyway, James stayed resolute in his unwillingness to become a practicing doctor. For a prolonged period, he did nothing at all, or next to nothing. Three years passed during which, aside from the occasional publication of articles and reviews, he devoted himself solely to reading books and thinking thoughts, often gloomy ones. Then, one day, he had a quite miraculous revelation: a very dark one that made him suddenly and exceedingly aware of his own mental fragility:

“Whilst in this state of philosophic pessimism and general depression of spirits about my prospects, I went one evening into the dressing room in the twilight… when suddenly there fell upon me without any warning, just as if it came out of the darkness, a horrible fear of my own existence. Simultaneously there arose in my mind the image of an epileptic patient whom I had seen in the asylum, a black-haired youth with greenish skin, entirely idiotic, who used to sit all day on one of the benches, or rather shelves, against the wall, with his knees drawn up against his chin, and the coarse gray undershirt, which was his only garment, drawn over them, inclosing his entire figure. He sat there like a sort of sculptured Egyptian cat or Peruvian mummy, moving nothing but his black eyes and looking absolutely non-human. This image and my fear entered into a species of combination with each other. That shape am I, I felt, potentially. Nothing that I possess can defend me against that fate, if the hour for it should strike for me as it struck for him. There was such a horror of him, and such a perception of my own merely momentary discrepancy from him, that it was as if something hitherto solid within my breast gave way entirely, and I became a mass of quivering fear. After this the universe was changed for me altogether. I awoke morning after morning with a horrible dread at the pit of my stomach, and with a sense of the insecurity of life that I never knew before, and that I have never felt since. It was like a revelation; and although the immediate feelings passed away, the experience has made me sympathetic with the morbid feelings of others ever since.” 5

Having suffered what today would very likely be called ‘a nervous breakdown’, James was forced to reflect on the current theories of the mind. Previously, he had accepted the materialist ‘automaton theory’ – that our ability to act upon the world depends not upon conscious states as such, but upon the brain-states that underpin and produce them – but now he felt that, if this were true, he was personally trapped forever in a depression that could only be cured by some kind of physical remedy. With no such remedy forthcoming, he was forced instead to tackle his own disorder through further introspection and self-analysis.

James read more and thought more since there was nothing else he could do. Three more desperately unhappy years would pass before he had sufficiently recuperated to rejoin the ordinary world, accepting an offer to become lecturer in physiology at Harvard. But as luck would have it, teaching suited James. He enjoyed the subject of physiology itself, and found the activity of teaching “very interesting and stimulating”. James had, for once, landed on his feet, and his fortunes were also beginning to improve in other ways.

Enjoying the benefits of a steady income for the first time in his life, he was soon to meet Alice Gibbens, the future “Mrs W.J.” They married two years later in 1878. She was a perfect companion – intelligent, perceptive, encouraging, and perhaps most importantly for James, an organising force in his life. He had also just been offered a publishing contract to write a book on his main specialism, which was by now – and in spite of such diversity of training – most definitely psychology. With everything now in place, James set to work on what would be his magnum opus. Wasting absolutely no time whatsoever, he drafted the opening chapters while still on honeymoon.

“What is this mythological and poetical talk about psychology and Psyche and keeping back a manuscript composed during honeymoon?” he wrote in jest to the taunts of a friend, “The only psyche now recognized by science is a decapitated frog whose writhings express deeper truths than your weak-minded poets ever dreamed. She (not Psyche but the bride) loves all these doctrines which are quite novel to her mind, hitherto accustomed to all sorts of mysticisms and superstitions. She swears entirely by reflex action now, and believes in universal Nothwendigkeit. [determinism]” 6

It took James more than a decade to complete what quickly became the definitive university textbook on the subject, ample time for such ingrained materialist leanings to have softened. For the most part sticking to what was directly and consciously known to him, his attempts to dissect the psyche involved much painstaking introspection of what he famously came to describe as his (and our) “stream of consciousness”. Such close analysis of the subjective experience of consciousness itself had suggested to James the need to distinguish between “the Me and the I” as separate component parts of what in completeness he called “the self”. 7 In one way or another, this division of self into selves, whether these be consciously apprehensible or not, has remained a theoretical basis of all later methods of psychoanalysis.

There is a joke that Henry James was a philosopher who wrote novels, whereas his brother William was a novelist who wrote philosophy. But this does WJ a disservice. James’s philosophy, known as pragmatism, was a later diversion. Unlike his writings on psychology, which became the standard academic texts as well as popular best-sellers (and what better tribute to James’s fluid prose), his ideas on pragmatism were rather poorly received. But then James was a lesser expert in philosophy, a situation not helped by his distaste for logical reasoning, and he is better remembered for his writings on psychology, the subject in which he excelled. By comparison, Freud’s claim to originality is nothing like as foundational.

For James had been at the vanguard just as psychology was pulling irreparably away from the grip that philosophy had held for so long (which explains why James was notionally Professor of Philosophy at the time he was writing), to be grafted back again as a part of biology. For this reason, and notwithstanding that James remained as highly critical of the developing field of experimental psychology as he was of the deductive reasoners on both sides of the English Channel – the British Empiricists Locke and Hume, and the continental giants Leibniz, Kant and Hegel – to some of his contemporaries James’s view appeared all too dangerously materialistic. If only they could have seen how areas of psychology were to so ruinously develop, they would have appreciated that James was, as always, a moderate.

*

Whereas James would remain an academic throughout his life, Freud, after briefly studying zoology at the University of Vienna – including one month spent unsuccessfully searching for the gonads of the male eel 8 – turned next to neurology, before deciding to return to medicine and open his own practice. Freud also received expert training in the new-fangled techniques of hypnosis.

‘Hypnosis’ comes from the Greek hupnos and means, in effect, “artificial sleep”. To induce hypnosis, a patient’s conscious mind needs to be distracted briefly. Achieving this brings communion with a part of the patient’s mind which appears to be something other than the usual conscious state. This other was bound to be given a name, so it is really not surprising that the terms “sub-conscious” and “unconscious” were already in circulation prior to the theories of Freud or James. But named or otherwise, mysterious evidence of the unconscious has always been known. Dreams, after all, though we consciously experience them, are neither consciously conceived nor willed. They just pop out from nowhere – or from “the unconscious”.

And Freud soon realised that there were better routes to the unconscious than hypnosis. For instance, he found that it was just as effective to listen to his patients, or if their conscious mind was still unwilling to give up some of its defences, to allow their free association of words and ideas. He also looked for unconscious connections within his patients’ dreams, gradually uncovering what he came to believe were the deeply repressed animalistic drives that governed their fears, attitudes and behaviour. Having found the unconscious root of their problems, the patient might at last begin to consciously grapple with it. It was a technique that apparently worked, with many of Freud’s patients recovering from the worst effects of their neuroses and hysteria, and so “the talking cure” became a lasting part of Freud’s legacy. You lay on the couch, and just out of sight, Freud listened and interpreted.

But Freud also left a bigger mark, by helping to shape the way we see ourselves. Drawing directly on his experiences as a doctor, he had slowly excavated, as he found it, the human unconscious piece by piece: and quite aside from the aspects he labelled the superego and the id, Freud claimed to have discovered the existence of the libido: a primary, sexual drive that operated beneath our conscious awareness, prompting our desires for pleasure and avoidance of pain regardless of whether these desires conflicted with ordinary social conventions. Freud discerned a natural process of psychological development 9 and came to believe that whenever this process is arrested or, more generally, whenever normal instinctual appetites are consciously repressed, the repressed desires lurking deep within the unconscious will automatically resurface in more morbid forms. This was, he determined, the common root cause of all his patients’ various symptoms and illnesses.

Had Freud stopped there, then I feel that his contribution to psychology would have been fully commendable, for there is considerable truth in what he is telling us. He says too much no doubt (especially when it comes to the specifics of human development), but he also says something that needed to be said most urgently: that if you force people to behave against their natures then you will very likely make them sick. However, Freud took his ideas a great deal further.

And so we come to the ‘Oedipus complex’, which, of the many Freudian features of our supposed psychological nether regions, is without doubt the one of greatest notoriety. The myth of Oedipus is a fascinating one in which the hero by his encounters is compelled to deal with fate, misfortune and prophecy. 10 But Freud finds in this tale a revelation of deep and universal unconscious repression:

“[Oedipus’s] destiny moves us only because it might have been ours – because the Oracle laid the same curse upon us before our birth as upon him. It is the fate of all of us, perhaps, to direct our first sexual impulse towards our mother and our first hatred and our first murderous wish against our father. Our dreams convince us that this is so.” 11

Freud generally studied those with minor psychological problems, determining on the basis of an unhappy few what he presumed true for healthier individuals too, and this is perhaps a failure of all psychoanalytic theories. It seems odd that he came to believe in the universality of the Oedipus Complex, but then who can doubt that he himself suffered from it greatly? Perhaps he also felt a ‘castration anxiety’ as a result of the Oedipal rivalry he’d had with his own father. Maybe he even experienced “penis envy”, if not of the same intensity he said he detected in his female patients, then of a compensatory masculine kind. After all, such unconscious ‘transference’ of attitudes and feelings from one person to another – from patient onto the doctor, or vice versa in this relevant example – is another concept that Freud was first to identify and label.

*

Given the prudish age in which Freud had fleshed out his ideas, it seems surprising perhaps how swiftly these theories received widespread acceptance and acclaim, although I can think of two good reasons why Freudianism took hold. The first is straightforward: society had been very badly in need of a dose of Freud, or something very like Freud. After so much repression, the pendulum was bound to swing the other way. But arguably the more important reason – indeed the reason his theories have remained influential – is that Freud picked up the baton directly from where Darwin left off. Restricted as it was to explanations grounded in biological instincts and drives, Freudianism wore the guise of scientific legitimacy, and this was a vital determining factor that helped to secure its prominent position within the modern epistemological canon.

Following his precedent, students of Freud, most notably Carl Jung and Alfred Adler, also drew on clinical experiences with their own patients, but gradually came to the conclusion, for different reasons, that Freud’s approach was too reductionist, and that there is considerably more to a patient’s mental well-being than healthy appetites and desires, and thus more to the psychological underworld than matters of sex and death.

Where Freud was a materialist and an atheist, Jung went on to incorporate aspects of the spiritual into his extended theory of the unconscious, though he remained respectful of biology and keen to anchor his own theories upon an evolutionary bedrock. Jung nevertheless speculates within a philosophical tradition that owes much to Kant, while also drawing heavily on personal experience, and comes to posit the existence of psychical structures he calls ‘archetypes’, operating at the deepest levels within a collective unconscious; a shared characteristic owing to our common ancestry.

Thus he envisions ‘the ego’ – the aspect of our psyche we identify as “I” – as existing in relation to an unknown and finally unknowable sea of autonomous entities which have their own life. Jung actually suggests that Freud’s Oedipus complex is just one of these archetypes, while he finds himself drawn to the bigger fish of the unconscious, beginning with ‘The Shadow’ – what is hidden and rejected by the ego – and what he determines are the communicating figures of ‘Anima/Animus’ (he also calls these ‘The Syzygy’) that prepare us for incremental and never-ending revelations of our all-encompassing ‘Self’. This lifelong psychical development, or ‘individuation’, was seen by Jung as an inherently religious quest, and he is unapologetic in proclaiming so; the religious impulse being simply another product of our evolutionary development, along with opposable thumbs and walking upright. However, rather than a mere vestigial hangover, religion is, for Jung, fundamental to the deep nature of our species.

Unlike Freud, Jung was also invested in understanding how the human psyche varies greatly from person to person, and to this end introduced new ideas about character types, adding ‘introvert’ and ‘extrovert’ to the psychological lexicon to draw a clear division between individuals characterised by primarily subjective or primarily objective orientations to life – being an introvert himself, Jung was well placed to see the distinction. Meanwhile, Adler was more concerned with our social identities, and with why people felt – in very many cases quite irrationally – inferior or superior to others. Adler’s efforts culminated in the development of a theory of the ‘inferiority complex’ – which might also be thought of as an aspect of the Jungian ‘Shadow’.

These different schools of psychoanalysis are not irreconcilable. They are indeed rather complementary in many ways: Freud tackling the animal craving and want of pleasure; Jung looking for expression above and beyond what William Blake once referred to as “this vegetable world”; and Adler delving most directly into the mud of human relations, and the lasting effects of personal trauma associated with social inequalities.

Freud presumes that since we are biological products of Darwinian evolution, our minds must have been evolutionarily pre-programmed. Turning the same inquiry outward, Jung goes in search of common symbolic threads within mythological and folkloric traditions, enlisting these as evidence for the psychological archetypes buried deep within us all. And though Jung held no orthodox religious views of his own, he felt perfectly comfortable drawing upon religious (including overtly Christian) symbolism. In one of his most contemplative passages, Jung wrote:

Perhaps this sounds very simple, but simple things are always the most difficult. In actual life it requires the greatest art to be simple, and so acceptance of oneself is the essence of the moral problem and the acid test of one’s whole outlook on life. That I feed the beggar, that I forgive an insult, that I love my enemy in the name of Christ—all these are undoubtedly great virtues. What I do unto the least of my brethren, that I do unto Christ.

But what if I should discover that the least amongst them all, the poorest of all beggars, the most impudent of all offenders, yea the very fiend himself—that these are within me, and that I myself stand in need of the alms of my own kindness, that I myself am the enemy who must be loved—what then? Then, as a rule, the whole truth of Christianity is reversed: there is then no more talk of love and long-suffering; we say to the brother within us “Raca,” and condemn and rage against ourselves. We hide him from the world, we deny ever having met this least among the lowly in ourselves, and had it been God himself who drew near to us in this despicable form, we should have denied him a thousand times before a single cock had crowed. 12

Of course, “the very fiend himself” is the Jungian ‘Shadow’, the contents of which, without recognition and acceptance, inevitably remain repressed, causing our thoughts and actions to be governed unconsciously, and these unapproachable and rejected aspects of our own psyche to be projected out onto the world. ‘Shadow projection’ onto others fills the world with enemies of our own imagining; and this, Jung believed, was the root of nearly all evil. Alternatively, by taking Jung’s advice and accepting “that I myself am the enemy who must be loved”, we come back to ourselves in wholeness, and then not only does the omnipresent threat of the Other diminish, but the veil of illusion between the ego and reality is thinned, with access to previously hidden strengths further enabling us to reach our fuller potential. 13

Today there are millions doing “shadow work” as it is now popularly known: self-help exercises often combined with traditional practices of yoga, meditation or the ritual use of entheogens: so here is a new meeting place – a modern mash-up – of religion and psychotherapy. Quietly and individually, a shapeless movement has arisen almost spontaneously as a reaction to the peculiar rigours of western civilisation. Will it change the world? For better or worse, it already has.

Alan Watts, who is best known for his Western interpretations of Eastern spiritual traditions, in particular Zen Buddhism and Daoism, here reads this same influential passage from one of Jung’s lectures, in which Jung speaks of ending “the inner civil war”:

*

Now what about my joke at the top? What’s that all about? Indeed, and in all seriousness, what makes it a joke at all? Well, not wishing to delve deeply into theories of comedy, there is one structure that arises repeatedly and nearly universally: the punch line to every joke relies on some kind of unexpected twist on the set-up…

“Why did the chicken cross the road?” Here we find an inherent ambiguity in the word “why”, and this is what sets up the twist. However, in the case of the joke about the psychiatrist and the man who thinks he’s a moth, the site of ambiguity isn’t so obvious. Here, I think, the humour comes down to our different and conflicting notions of ‘belief’.

A brief digression then: What is belief? To offer a salient example, when someone tells you “I believe in God”, what are they intending to communicate? No less importantly, what would you take them to mean? Put differently, atheists will very often say “I don’t believe in anything” – so again, what are they (literally) trying to convey here? And what would a listener take them to mean? In all these instances the same word is used to describe similar but distinct attitudinal relationships to reality, so it is easy to presume that everyone is using the word in precisely the same way; a presumption that immediately becomes less certain once we acknowledge that the word “belief” actually carries two quite distinct meanings.

According to the first definition, it is “a mental conviction of the truth of an idea or some aspect of reality”. Belief in UFOs fits this criterion, as does a belief in gravity and that the sun will rise again tomorrow. How about belief in God? When late in life Jung was asked if he believed in God, he replied straightforwardly “I know”. 14 Others reply with just the same degree of conviction when asked about angels, fairies, spirit guides, ghosts or the power of healing and crystals. As a physicist, I believe in the existence of atoms, electrons and quarks – although I’ve never “seen one”, like Jung I know! Belief in this sense is more often than not grounded in a person’s direct experiences, which obviously does not validate the objective truth of their belief. He saw a ghost. She was healed by the touch of a holy man. We ran experiments to measure the charge on an electron. Again, in this sense I have never personally known anyone who did not believe in the physical reality of a world of solid objects – for who doesn’t believe in tables and chairs? In this important sense everyone has many convictions about the truth of reality, and we surely all believe in something – this applies even in the case of the most hardline of atheists!

But there is also a second kind of belief: “of an idea that is believed to be true or valid without positive knowledge.” The emphasis here is on the lack of knowledge or indeed of direct experience. So this belief involves an effort of willing on the part of the believer. In many ways, this is to believe in make-believe, or we might just say “to make-believe”: to pretend or wish that something is real. I believe in unicorns…

As a child, I found all religion utterly mystifying, since what was clearly make-believe was, for reasons I couldn’t comprehend, being held up as sacrosanct. Based on my fleeting encounters with Christianity, it also seemed evident that the harder you tried to make-believe in this maddening mystification of being, the better a person it made you! So when someone says they believe in God, is this all they actually mean? That they are trying very hard to make-believe in an impossibility? Indeed, is this striving alone mistaken not only as virtuous but as actual believing in the first sense? Yes, quite possibly – and not only for religious types. As Kurt Vonnegut wrote in the introduction to his novel Mother Night: “This is the only story of mine whose moral I know”, continuing: “We are what we pretend to be, so we must be careful about what we pretend to be.” 15

Alternatively, it may be that someone truly believes in God – or whatever synonym they choose to approximate to ‘cosmic higher consciousness’ – with the same conviction that all physicists believe in gravity and atoms. They may come to know ‘God’, as Jung did.

Now back to the joke, and apologies for killing it: the man complains that he feels like a moth, and this is so silly that we automatically presume his condition is entirely one of make-believe. But then comes the twist, when we learn that he is acting according to his conviction, which means he holds a true belief of the first kind. Here’s my hunch, then, for why we find this funny: it spontaneously reminds us of how true beliefs – rather than make-believe – both inform reality as we perceive it and fundamentally shape our behaviour. So here we come back to Vonnegut’s moral. Yet we are always in the process of forgetting altogether that this is how we also live, until abruptly the joke reminds us again – and in our moment of recollecting, spontaneously we laugh.

*

And the joke was hilarious wasn’t it? No, you didn’t like it…? Well, if beauty is in the eye of the beholder, comedy surely lies in the marrow of the funny bone! Which brings me to ask: why is there comedy? More broadly, why is there laughter – surely the most curious human reflex of all – or its very closely related reflex cousin, crying? In fact, the emission of tears from the nasolacrimal ducts not in response to irritation of our ocular structures but purely for reasons of joy or sorrow is a very nearly uniquely human secretomotor phenomenon. (Excuse my Latin!) 16

The jury is still out on the evolutionary function of laughing and crying, but when considered in strictly Darwinian terms (as current Science insists), it is hard to fathom why these dangerously debilitating and potentially life-threatening responses ever developed in any species. Indeed, it is acknowledged that a handful of unlucky (perhaps lucky?) people have literally died from laughter. So why do we laugh? Why do we love laughter, whether our own or others’, so much? Your guess is as good as mine, and, more importantly, as good as Darwin’s.

Less generously, Thomas Hobbes, who explained all human behaviour in terms of gaining social advantage, wrote that:

Joy, arising from imagination of a man’s own power and ability, is that exultation of the mind which is called glorying… Sudden Glory, is the passion which maketh those grimaces called LAUGHTER; and is caused either by some sudden act of their own, that pleaseth them; or by the apprehension of some deformed thing in another, by comparison whereof they suddenly applaud themselves. 17

And yes, it is true that a great deal of laughter comes at the expense of some butt of our joking; however, not all mockery involves a victim, and there’s a great deal more to humour and laughter than mere ridicule and contempt. So Hobbes’ account is at best a very desiccated postulation of why humans laugh, let alone of what constitutes joy.

Indeed, Hobbes’ reductionism is evidently mistaken and misinformed not only by his deep-seated misanthropy, but also by a seeming lack of common insight which leads one to suspect that when it came to sharing any jokes, he just didn’t get it. But precisely what didn’t he get?

Well, apparently he didn’t get how laughter can be a straightforward expression of joie de vivre. Too French, I imagine! Or that when we apprehend anything and find it amusing, this momentarily snaps us from a prior state of inattention, and that in the joy of an abrupt, often fleeting, but totally fresh understanding, the revelation itself may elicit laughter (as I already outlined above). Or that it is simply impossible to laugh authentically or infectiously unless you not only understand the joke, but fully acknowledge it. In this way, humour, if confessional, may be liberating at a deeply personal level, or if satirical, liberating at a penetrating societal level. Lastly (in my limited rundown), humour serves as a wonderfully efficient and entertaining springboard for communicating insight and understanding, especially when the truths are dry, difficult to grasp or otherwise unpalatable. Here is a rhetorical economy that Hobbes might actually have approved of, were it not for his somewhat curmudgeonly disposition.

And why tell a joke here? Just to make you laugh and take your mind off the gravity of the topics covered and still more grave ones to come? To an extent, yes, but also to broaden out our discussion, letting it drift off into related philosophical avenues. For existence is seemingly absurd, is it not? Considered squarely, full-frontal, what’s it all about…? And jokes – especially ones that work beyond rational understanding – offer a playful recognition of the nonsensicalness of existence and of our species’ farcical determination to comprehend it and ourselves fully. What gives us the gall to ever speculate on the meaning of life, the universe and everything?

Meanwhile, we are free to choose: do we laugh or do we cry at our weird predicament? Both responses are surely sounder than cool insouciance, since both are flushed with blood. And were we madder, we might scream instead, of course, whether in joy or terror.

French existentialist Albert Camus famously made the claim: “There is but one truly serious philosophical problem and that is suicide.” 18 Camus was not an advocate of suicide, of course; far from it. In fact, he saw it as a perfectly vain attempt to flee from the inescapable absurdity of life, something he believed we ought to embrace in order to live authentically. Indeed, Camus regarded every attempt to deny the ultimate meaninglessness of life in a universe that is indifferent to our suffering as a surrogate form of psychological suicide.

But rather than staring blankly into the abyss, Camus urges us to rebel against it: to face its absurdity without flinching and, through rebellion – by virtue of which we individually reconstruct the meaning of our lives afresh – to arrive, albeit paradoxically, at an extreme rationality. Perhaps, though, he goes too far, reaching a point so extreme that few can follow: such a Sisyphean outlook being too desolate for most of us, and his exhortation to authenticity so impassioned that it seems almost infinitely taxing. 19 Kierkegaard’s “leap of faith” is arguably more forgiving of the human condition – but enough of philosophy. 20

This pause is meant for introspection. I have therefore presented an opportunity to reconsider how my interlude set out: not only by telling a joke – hopefully one that made you smile, if not laugh out loud – but also by reflecting upon the beautiful wisdom encapsulated in Chuang Tzu’s dream of becoming a butterfly; mystical enlightenment from 4th-century BC China that in turn clashes against the plain silliness of a doctor-doctor joke about a moth-man; a surreal quip that also touches on the use of clinical diagnosis in modern psychiatry (something I shall come to consider next).

The running theme here is one of transformation, and at the risk of also killing Chuang Tzu’s message by dissection, I will simply add (unnecessarily from the Daoist perspective) that existence does appear to be cyclically transformative; at both personal and collective levels, the conscious and unconscious spiralling outwards – whether upward into light or downward into darkness – each perpetually giving rise to the other just like the everblooming of yang and yin. As maverick clinical psychiatrist R. D. Laing once wrote:

“Most people most of the time experience themselves and others in one way or another that I… call egoic. That is, centrally or peripherally, they experience the world and themselves in terms of a consistent identity, a me-here over against you-there, within a framework of certain ground structures of space and time shared with other members of their society… All religious and all existential philosophies have agreed that such egoic experience is a preliminary illusion, a veil, a film of maya—a dream to Heraclitus, and to Lao Tzu, the fundamental illusion of all Buddhism, a state of sleep, of death, of socially accepted madness, a womb state to which one has to die, from which one has to be born.” 21

Returning from the shadowlands of alienation to contemplate again the glinting iridescent radiance of Chuang Tzu’s butterfly’s wings is an invitation to scrape away the dross of habituated semi-consciousness that veils the playful mystery of our minds. On a different occasion, Chuang Tzu wrote:

One who dreams of drinking wine may in the morning weep; one who dreams weeping may in the morning go out to hunt. During our dreams we do not know we are dreaming. We may even dream of interpreting a dream. Only on waking do we know it was a dream. Only after the great awakening will we realize that this is the great dream. And yet fools think they are awake, presuming to know that they are rulers or herdsmen. How dense! You and Confucius are both dreaming, and I who say you are a dream am also a dream. Such is my tale. It will probably be called preposterous, but after ten thousand generations there may be a great sage who will be able to explain it, a trivial interval equivalent to the passage from morning to night. 22

Thus the world about us is scarcely less a construct of our imagination than our dreams are, deconstructed by the senses then seamlessly reconstructed in its entirety. And not just reconfigured via inputs from the celebrated five gateways of vision, sound, touch, taste and smell, but all portals including those of memory, intuition, and even reason. After all, it is curious how we speak of having ‘a sense’ of reason, just as we do ‘a sense’ of humour. Well, do we have… a sense of reason and a sense of humour? If you have followed this far then I sense you may share my own.

Next chapter…

*

Richard Rohr is a Franciscan priest, author and teacher, who says that his calling has been “to retrieve and re-teach the wisdom that has been lost, ignored or misunderstood in the Judeo-Christian tradition.” Rohr is the founder of the Center for Action and Contemplation and academic dean of the CAC’s Living School, where he practises incarnational mysticism, non-dual consciousness, and contemplation, with a particular emphasis on how these affect the social justice issues of our time. Recently he shared his inspirational standpoint in an hour-long chat with Rick Archer, host of ‘Buddha at the Gas Pump’:

*

Addendum: anyone with half a brain

“The intuitive mind is a sacred gift and the rational mind is a faithful servant. We have created a society that honors the servant and has forgotten the gift.”

— attributed to Albert Einstein 23

*

The development of split-brain operations for the treatment of severe cases of epilepsy, which involve severing the corpus callosum, the thick web of nerves that allows communication between the two hemispheres, first drew attention to how the left and right hemispheres have quite different attributes. Unfortunately, the early studies in this field produced superficial and therefore erroneous notions about left and right brain functions, which were in turn vulgarised and popularised as they percolated down into pop psychology and management theory. The left brain was said to generate language and logic, while the right brain supposedly dealt with feelings and was the creative centre. In reality, both hemispheres are involved in all aspects of cognition, and as a consequence the study of what is technically called the lateralisation of brain function fell to some extent into academic disrepute.

In fact, important differences do occur between the specialisms of the left and right hemispheres, although as psychiatrist Iain McGilchrist proposes in his book The Master and His Emissary (the title describing what he sees as the proper roles of the right and left hemispheres respectively) 24, it is often better to understand the distinctions in terms of where conscious awareness is placed. In summary, the left hemisphere attends to and focuses narrowly but precisely on what is immediately in front of you, allowing you to strike the nail with the hammer, thread the eye of the needle, or sort the wheat from the chaff (or whatever activity you might be actively engaged in), while the right hemisphere remains highly vigilant and attentive to the surroundings. Thus, the left brain operates tools and usefully sizes up situations, while the right brain’s immediate relationship to the environment and to our bodies makes it the mediator of social activities and of a far broader conscious awareness. However, according to McGilchrist, the left brain is also convinced of its own primacy, whereas the right is incapable of comprehending such hierarchies; and this is arguably the root of a problem we all face, since it repeatedly leads humans to construct societal arrangements and norms in accordance with left brain dominance, to the inevitable detriment of less restricted right brain awareness.

Supported by many decades of research, this has become the informed view of McGilchrist, and given that his overarching thesis has merit – note that the basic distinctions between left and right brain awareness are uncontroversial and well understood in psychology, whereas what he sees as the socio-historical repercussions are more speculative – it raises brain function lateralisation as a major underlying issue that needs to be incorporated into any final appraisal of ‘human nature’, the implications of which McGilchrist propounds at length in his own writing. In the preface to the new expanded edition of The Master and His Emissary (first published in 2009), he writes:

I don’t want it to be possible, after reading this book, for any intelligent person ever again to see the right hemisphere as the ‘minor’ hemisphere, as it used to be called – still worse the flighty, impetuous, fantastical one, the unreliable but perhaps fluffy and cuddly one – and the left hemisphere as the solid, dependable, down-to-earth hemisphere, the one that does all the heavy lifting and is alone the intelligent source of our understanding. I might still be to some extent swimming against the current, but there are signs that the current may be changing direction.

*

Embedded below is a lecture given to the Royal Society of Arts (RSA) in 2010, in which he offers a concise overview of how, according to our current understanding, the ‘divided brain’ has profoundly altered human behaviour, culture and society:

To hear these ideas contextualised within an evolutionary account of brain laterality, I also recommend a lecture given to The Evolutionary Psychiatry Special Interest Group of the Royal College of Psychiatrists in London (EPSIG UK) in 2018:

For more from Iain McGilchrist I also recommend this extended interview with physicist and filmmaker Curt Jaimungal, host of Theories of Everything, which premiered on March 29th:

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.

*

1 Quoted from the book known as Zhuangzi (also transliterated as Chuang Tzu or Chuang Chou). Translation by Lin Yutang

2 “insensible perceptions are as important to [the science of minds, souls, and soul-like substances] as insensible corpuscles are to natural science, and it is just as unreasonable to reject the one as the other on the pretext that they are beyond the reach of our senses.” From the Preface of New Essays concerning Human Understanding by Gottfried Leibniz, first published in 1704; translation courtesy of the Stanford Encyclopedia of Philosophy.

3 From Anthropology from a Pragmatic Point of View by Immanuel Kant, first published in 1798.

4 “The definition of Psychology may be best given… as the description and explanation of states of consciousness as such. By states of consciousness are meant such things as sensations, desires, emotions, cognitions, reasonings, decisions, volitions, and the like. Their ‘explanation’ must of course include the study of their causes, conditions, and immediate consequences, so far as these can be ascertained.” from opening paragraph of “Introduction: Body and Mind” from The Principles of Psychology, by William James, first published in 1892.

5 Extract taken from The Varieties of Religious Experience, from the chapter on “The Sick Soul”.

6 Letter to his friend, Francis Child.

7 According to James, the first division of “the self” that can be discriminated is between “the self as known”, the me, and “the self as knower”, the I, or “pure ego”. The me he then suggests might be sub-divided in a constituent hierarchy: “the material me” at the lowest level, then “the social me”, and top-most “the spiritual me”. It was not until very much later, in the 1920s, that Freud fully developed his own tripartite division of the psyche into id, ego and super-ego, a division that surely owes much to James.

8

In the spring of 1876, a young man of nineteen arrived in the seaside city of Trieste and set about a curious task. Every morning, as the fishermen brought in their catch, he went to meet them at the port, where he bought eels by the dozens and then the hundreds. He carried them home, to a dissection table in a corner of his room, and—from eight until noon, when he broke for lunch, and then again from one until six, when he quit for the day and went to ogle the women of Trieste on the street—he diligently slashed away, in search of gonads.

“My hands are stained by the white and red blood of the sea creatures,” he wrote to a friend. “All I see when I close my eyes is the shimmering dead tissue, which haunts my dreams, and all I can think about are the big questions, the ones that go hand in hand with testicles and ovaries—the universal, pivotal questions.”

The young man, whose name was Sigmund Freud, eventually followed his evolving questions in other directions. But in Trieste, elbow-deep in slime, he hoped to be the first person to find what men of science had been seeking for thousands of years: the testicles of an eel. To see them would be to begin to solve a profound mystery, one that had stumped Aristotle and countless successors throughout the history of natural science: Where do eels come from?

From an article entitled “Where Do Eels Come From?” written by Brooke Jarvis, published in New Yorker magazine on May 18, 2020. https://www.newyorker.com/magazine/2020/05/25/where-do-eels-come-from

9 Fixing on specific erogenous zones of the body, Freud believed that libidinous desire shaped our psychological development in a very specific fashion, naturally progressing, if permitted, through early stages from oral, to anal, and then, on reaching adulthood, to genital.

10 Jocasta, the queen of Thebes, is barren, and so she and her husband, the king Laius, decide to consult the Oracle of Delphi. The Oracle tells them that if Jocasta bears a son, then the son will kill his father and marry her. Later, when Jocasta does indeed have a son, Laius demands that a servant take the baby to a mountain to be abandoned, his ankles pinned together just in case. But Oracles are rarely mistaken and fate is hard to avoid, so as it happens the servant spares the infant, giving him to a shepherd instead. Eventually, as fortune will have it, the infant is adopted by the king and queen of Corinth, and named Oedipus because of the swellings on his feet. Years pass. Then, one day, Oedipus learns that the king and queen are not his parents, but when he asks them, they deny the truth. So Oedipus decides to put the question to the Oracle of Delphi instead, who, being an enigmatic type, refuses to identify his true parents but foretells his future instead, saying that he is destined to kill his father and marry his mother. Determined to avoid this fate, Oedipus resolves not to return home to Corinth, heading instead to, you guessed it, Thebes. He comes to an intersection of three roads and meets Laius driving a chariot. They argue about who has the right of way and then, in an early example of road rage, their quarrel spills into a fight and thus Oedipus unwittingly kills his real father. Next up, he meets the sphinx, who asks its famous riddle. This is a question of life and death, all who have answered incorrectly having been killed and eaten, but Oedipus gets the answer right and so, obligingly, the sphinx kills itself instead. Having freed the people of Thebes from the sphinx, Oedipus receives the hand of the recently widowed Jocasta in marriage. All is well for a while, but then it comes to pass that Jocasta learns who Oedipus really is, and hangs herself. Then, later again, Oedipus discovers that he was the murderer of his own father, and gouges his own eyes out.

11 Sigmund Freud, The Interpretation of Dreams, chapter V, “The Material and Sources of Dreams”

12 From an essay by C.G. Jung published in CW XI, Para 520. The word ‘Raca’ is an insult translated as ‘worthless’ or ‘empty’ taken from a passage in the Sermon on the Mount from Matthew 5:22.

13 Jung described the shadow in a key passage as “that hidden, repressed, for the most part inferior and guilt-laden personality whose ultimate ramifications reach back into the realm of our animal ancestors…If it has been believed hitherto that the human shadow was the source of evil, it can now be ascertained on closer investigation that the unconscious man, that is his shadow does not consist only of morally reprehensible tendencies, but also displays a number of good qualities, such as normal instincts, appropriate reactions, realistic insights, creative impulses etc”

From Jung’s Collected Works, 9, part 2, paragraph 422–3.

14 In response to a question in an interview conducted by John Freeman just two years before Jung’s death and broadcast as part of the BBC Face to Face TV series in 1959. Having asked about Jung’s childhood and whether he had to attend church, Freeman then asked: “Do you now believe in God?” Jung replied: “Now? Difficult to answer… I know. I don’t need to believe I know.”

15 The quote in full reads: “This is the only story of mine whose moral I know. I don’t think it’s a marvelous moral, I just happen to know what it is: We are what we pretend to be, so we must be careful about what we pretend to be.” From Mother Night (1962) by Kurt Vonnegut.

16 In his follow-up to the more famous On the Origin of Species (1859) and The Descent of Man (1871), his third major work The Expression of the Emotions in Man and Animals (1872), Charles Darwin reported in Chapter VI, entitled “Special Expressions of Man: Suffering and Weeping”, that:

I was anxious to ascertain whether there existed in any of the lower animals a similar relation between the contraction of the orbicular muscles during violent expiration and the secretion of tears; but there are very few animals which contract these muscles in a prolonged manner, or which shed tears. The Macacus maurus, which formerly wept so copiously in the Zoological Gardens, would have been a fine case for observation; but the two monkeys now there, and which are believed to belong to the same species, do not weep. Nevertheless they were carefully observed by Mr. Bartlett and myself, whilst screaming loudly, and they seemed to contract these muscles; but they moved about their cages so rapidly, that it was difficult to observe with certainty. No other monkey, as far as I have been able to ascertain, contracts its orbicular muscles whilst screaming.

The Indian elephant is known sometimes to weep. Sir E. Tennent, in describing these which he saw captured and bound in Ceylon, says, some “lay motionless on the ground, with no other indication of suffering than the tears which suffused their eyes and flowed incessantly.” Speaking of another elephant he says, “When overpowered and made fast, his grief was most affecting; his violence sank to utter prostration, and he lay on the ground, uttering choking cries, with tears trickling down his cheeks.” In the Zoological Gardens the keeper of the Indian elephants positively asserts that he has several times seen tears rolling down the face of the old female, when distressed by the removal of the young one. Hence I was extremely anxious to ascertain, as an extension of the relation between the contraction of the orbicular muscles and the shedding of tears in man, whether elephants when screaming or trumpeting loudly contract these muscles. At Mr. Bartlett’s desire the keeper ordered the old and the young elephant to trumpet; and we repeatedly saw in both animals that, just as the trumpeting began, the orbicular muscles, especially the lower ones, were distinctly contracted. On a subsequent occasion the keeper made the old elephant trumpet much more loudly, and invariably both the upper and lower orbicular muscles were strongly contracted, and now in an equal degree. It is a singular fact that the African elephant, which, however, is so different from the Indian species that it is placed by some naturalists in a distinct sub-genus, when made on two occasions to trumpet loudly, exhibited no trace of the contraction of the orbicular muscles.

The full text is uploaded here: https://www.gutenberg.org/files/1227/1227-h/1227-h.htm#link2HCH0006

17 Quote from Leviathan (1651), The First Part, Chapter 6, by Thomas Hobbes (with italics and spelling as in the original). Hobbes continues:

And it is incident most to them, that are conscious of the fewest abilities in themselves; who are forced to keep themselves in their own favour, by observing the imperfections of other men. And therefore much Laughter at the defects of others is a signe of Pusillanimity. For of great minds, one of the proper workes is, to help and free others from scorn; and compare themselves onely with the most able.

Interestingly, Hobbes then immediately offers his account of weeping as follows:

On the contrary, Sudden Dejection is the passion that causeth WEEPING; and is caused by such accidents, as suddenly take away some vehement hope, or some prop of their power: and they are most subject to it, that rely principally on helps externall, such as are Women, and Children. Therefore, some Weep for the loss of Friends; Others for their unkindnesse; others for the sudden stop made to their thoughts of revenge, by Reconciliation. But in all cases, both Laughter and Weeping, are sudden motions; Custome taking them both away. For no man Laughs at old jests; or Weeps for an old calamity.

https://www.gutenberg.org/files/3207/3207-h/3207-h.htm#link2H_PART1

18 “Il n’y a qu’un problème philosophique vraiment sérieux : c’est le suicide.” (“There is but one truly serious philosophical problem, and that is suicide.”) Quote taken from The Myth of Sisyphus (1942) by Albert Camus, translated by Justin O’Brien.

19 In Greek mythology Sisyphus was punished in the underworld by being forced to roll a huge boulder up a hill, only for it to roll back down each time, repeating this action for eternity. In his philosophical essay The Myth of Sisyphus (1942) Camus compares Sisyphus’ unremitting and unrewarding task to the lives of ordinary people in the modern world, writing:

“The workman of today works every day in his life at the same tasks, and this fate is no less absurd. But it is tragic only at the rare moments when it becomes conscious.”

In sympathy he also muses on Sisyphus’ thoughts, especially as he trudges in despair back down the mountain to collect the rock again. He writes:

“You have already grasped that Sisyphus is the absurd hero. He is, as much through his passions as through his torture. His scorn of the gods, his hatred of death, and his passion for life won him that unspeakable penalty in which the whole being is exerted toward accomplishing nothing. This is the price that must be paid for the passions of this earth. Nothing is told us about Sisyphus in the underworld. Myths are made for the imagination to breathe life into them.”

Continuing:

“It is during that return, that pause, that Sisyphus interests me. A face that toils so close to stones is already stone itself! I see that man going back down with a heavy yet measured step toward the torment of which he will never know the end. That hour like a breathing-space which returns as surely as his suffering, that is the hour of consciousness. At each of those moments when he leaves the heights and gradually sinks toward the lairs of the gods, he is superior to his fate. He is stronger than his rock.

“If this myth is tragic, that is because its hero is conscious. Where would his torture be, indeed, if at every step the hope of succeeding upheld him? The workman of today works every day in his life at the same tasks, and his fate is no less absurd. But it is tragic only at the rare moments when it becomes conscious. Sisyphus, proletarian of the gods, powerless and rebellious, knows the whole extent of his wretched condition: it is what he thinks of during his descent. The lucidity that was to constitute his torture at the same time crowns his victory. There is no fate that cannot be surmounted by scorn.”

You can read the extended passage here: http://dbanach.com/sisyphus.htm

20 Søren Kierkegaard never actually coined the term “leap of faith”, although he did use the more general notion of a “leap” to describe situations in which a person faces a choice that cannot be fully justified rationally. Moreover, in this instance the “leap” is perhaps better described as a leap “towards” or “into” faith, one that finally overcomes what Kierkegaard saw as an inherent paradox between the ethical and the religious. Kierkegaard never advocates “blind faith”, however, but instead recognises that faith ultimately calls for action in the face of absurdity.

In Part Two, “The Subjective Issue”, of his 1846 work Concluding Unscientific Postscript to the Philosophical Fragments (Danish: Afsluttende uvidenskabelig Efterskrift til de philosophiske Smuler), an impassioned attack on Hegelianism known for its dictum “Subjectivity is Truth”, Kierkegaard wrote:

“When someone is to leap he must certainly do it alone and also be alone in properly understanding that it is an impossibility… the leap is the decision… I am charging the individual in question with not willing to stop the infinity of [self-]reflection. Am I requiring something of him, then? But on the other hand, in a genuinely speculative way, I assume that reflection stops of its own accord. Why, then, do I require something of him? And what do I require of him? I require a resolution.”

21 R. D. Laing, The Politics of Experience (Ballantine Books, N.Y., 1967)

22 Quoted from the book known as the Zhuangzi (also transliterated Chuang Tzu, after its author Chuang Chou). Translation by Lin Yutang.

23 This is in all likelihood a reworking of a passage from The Metaphoric Mind: A Celebration of Creative Consciousness (1976) by Bob Samples, in which the fuller passage reads [with emphasis added]:

“The metaphoric mind is a maverick. It is as wild and unruly as a child. It follows us doggedly and plagues us with its presence as we wander the contrived corridors of rationality. It is a metaphoric link with the unknown called religion that causes us to build cathedrals — and the very cathedrals are built with rational, logical plans. When some personal crisis or the bewildering chaos of everyday life closes in on us, we often rush to worship the rationally-planned cathedral and ignore the religion. Albert Einstein called the intuitive or metaphoric mind a sacred gift. He added that the rational mind was a faithful servant. It is paradoxical that in the context of modern life we have begun to worship the servant and defile the divine.”

24 The book is subtitled The Divided Brain and the Making of the Western World.


keep taking the tablets

The following article is Chapter Four of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.

*

“Psychiatry could be, or some psychiatrists are, on the side of transcendence, of genuine freedom, and of true human growth. But psychiatry can so easily be a technique of brainwashing, of inducing behaviour that is adjusted, by (preferably) non-injurious torture. In the best places, where straitjackets are abolished, doors are unlocked, leucotomies largely forgone, these can be replaced by more subtle lobotomies and tranquillizers that place the bars of Bedlam and the locked doors inside the patient.”  

— R. D. Laing in a later preface to The Divided Self. 1

*

A few notes of caution before proceeding:

From this point onwards I shall use the words ‘madness’ and ‘insanity’ interchangeably, to denote mental illness of different kinds in an entirely general and overarching way. Beyond the shorthand, I have adopted this approach for two principal reasons. Given the nature of the field and on the basis of historical precedent, technical labels tend to be transitory and soon superseded, so traditional and non-technical language spares us from grappling with the elaborate definitions found in the diagnostic manuals of psychiatry (more later), while also keeping clear of the euphemism treadmill. Moreover, the older terms have a simplicity which, if used with sensitivity, bestows weight on the day-to-day misery of mental illness and dignifies its suffering. R. D. Laing, who spent a lifetime treating patients with the most severe schizophrenia, unflinchingly talked about ‘madness’. A flawed genius, Laing is someone I return to in the final section of the chapter.

The second point I wish to highlight is that illnesses associated with the workings of the mind will sadly, in all likelihood, remain a cause of social prejudice and discrimination. In part this is due to the detrimental effect mental illness has on interpersonal relationships. And since ‘the person’ – whatever this entity can be said to fully represent – is presupposed to exist in a kind of one-to-one equivalence to the mind, it is basically taken for granted not only that someone’s behaviour correlates to unseen mental activity, but that it is an expression of their character. Indeed, person, mind and behaviour are usually apprehended as a sort of coessential three-in-one.

All forms of suffering are difficult to face, of course, for loved ones as for the patient; but our degree of separation becomes heightened once someone’s personality is significantly altered through illness. I contend, however, that beyond these often practical concerns, there are further barriers that lie in the way of our full acceptance of mental illness: barriers automatically instilled by everyday attitudes and opinions that may cause us to register a greater shock when faced with the sufferings of an unsound mind, some features of the disease not just directly clashing with expectations of acceptable human behaviour, but threatening, on occasion, fundamental notions of what it means to be human.

For these reasons mental illness tends to isolate its victims. Those who in all likelihood are suffering profound existential detachment become further detached from ordinary human contact. In extreme circumstances, mental illness makes its victims appear as monstrosities – the freaks whom ordinary folks once visited asylums simply to gawp at, when it cost only a shilling to see “the beasts” rave at Bedlam, as London’s Bethlem Royal Hospital was once known. 2 Whom the gods would destroy they first make mad, the ancient saying goes 3, and it is difficult indeed to conjure up any worse fate than this.

*

Before returning to the main issues around mental illness, I wish to briefly consider changing societal attitudes toward behaviour in general. The ongoing trend for many decades has been for society to become more tolerant of alternative modes of thinking and acting. Indeed, a plethora of interpersonal norms have either lapsed altogether or are now regarded as old-fashioned and outmoded, with others already in the process of slow abandonment. For successive generations, the youth has looked upon itself as more liberated than its parents’ generation, which it then regards, rightly or wrongly, as repressive and rigid.

To cite a rather obvious example, from the 1950s onwards sex has been gradually and almost totally unhitched from marriage, and commensurate with this detachment there is more and more permission – indeed encouragement – to be sexually experimental: yesterday’s magnolia has been touched up to include a range of fifty thousand shades of grey! 4 But the zone of the bedroom is perhaps the exception rather than the rule, and outside its liberally sanctioned walls much that was seen as transgressive remains so, and in fact continues to be either prohibited by law or else proscribed by custom or just ‘plain common sense’ – thus we are constrained by restrictions sometimes imposed for perfectly sound reasons, as well as by others that lack clear ethical or rational justification.

Arguably indeed, there are as many taboos today as yesterday informing our oftentimes odd and incoherent relationships to our own bodies and minds. As another illustrative example, most of us have probably heard how the Victorians were so prudish that they would conceal the nakedness of their piano legs behind little skirts of modesty (in fact an urban myth), when it is surely more scandalous (at least by today’s standards) that drugs including laudanum (tincture of opium) were freely available to all over the counter at the local apothecary.

It seems indeed that just as we loosened restraints on sexuality, new anxieties began to spring up concerning our relationship with our bodies as such. Suddenly perhaps we had more to measure up to, especially once all the bright young (and rather scantily-clad) things began to parade themselves indecorously if alluringly throughout our daily lives: ubiquitous in movies, on TV, billboards, and in magazines and newspapers. The most intriguing aspect of this hypersexualisation, however, is that modern society has simultaneously remained prudish in many other regards, most curiously in the case of public nudity; an ‘indecency’ that goes completely unrecognised within so-called primitive societies.

In parallel with these changes, our own culture, which increasingly fixates on youthfulness, has simultaneously fallen into the habit of marginalising old age and death. Not that death, as often presumed, now represents our final unuttered taboo, because arguably more shunned even than death is madness, presumably because its spectre remains so uniquely terrifying to us.

The overarching point is that no society, however permissive, is ever well-disposed toward individuals who fail to measure up to established norms. The rule is perfectly straightforward in fact: in every society and throughout historical times, social deviants are prone to be ostracised. And as a rule, this applies whether one’s behavioural aberrance is a matter of personal choice or not.

I conjecture, moreover, that our abhorrence of madness is actually informed by the very biological classification of our species and sub-species: Homo sapiens sapiens. The wise, wise man! By which we discreetly imply (in our determinedly positivist account) the rational, rational man! Thus, to “lose your mind”, as we often say colloquially, involves the loss of the singular vital faculty – dare I say our ‘essential’ faculty? – the very thing that taxonomically defines us.

Of course, we are trespassing on hugely controversial territory and into areas I am (by profession) totally unqualified to enter. This must be conceded; nevertheless, I do have privileged access when it comes to entering and exploring the field, as do you. For we all have insider knowledge and a deeply vested interest when it comes to comprehending the intricate activities of human consciousness, while no-one has the superhuman immunity that ensures perfect mental health – indeed, most people quietly experience episodes, whether passing or more prolonged, when their minds go a little wonky.

Lastly then, my real purpose is not to dwell on what madness may be, but, arguably more importantly, to consider the consequences of being treated as mad; and in both senses of ‘treated’. So let’s just slip into these white coats. Ready…? Now to begin some informal examination of this rather delicate matter that is of such immediate and absolutely central importance.

*

I        Sorting the sheep from the goats

“Not all who rave are divinely inspired” – Morris Raphael Cohen

*

“The sole difference between myself and a madman is the fact that I am not mad!” said Salvador Dalí. 5 Dalí, with his dangerous flair for showmanship, was keen to impress upon his audience the exceptionally deranged quality of his genius, yet this well-known quip appeals in part because genius and madness are already romantically entwined, especially in the popular imagination.

Genius equates to madness presumably because both elude ordinary forms of thinking, and thus, a rather banal accounting goes: genius appears as madness when it is anything but. Alternatively, however, and as Dalí intimates, genius truly is a form of madness, at least for some. The artistic visionary in particular draws inspiration, if not upon literal hallucinatory visions – as the poet William Blake did – then from the upwelling of deep and uncertain psychological forces within.

Fascinated by the half-light and the liminal, impelled upon occasion to peer into the abyss, the genius in extreme cases will indeed tread close to the verge of madness. Yet most geniuses have not gone mad, nor does genius seem especially vulnerable or susceptible to such self-destructive forces. Even amongst the greatest artists, exceptions prove to be the rule – the manic depression of Vincent van Gogh, the profound melancholia of Robert Schumann, the self-destructive alcoholism of Jackson Pollock (and it is noteworthy that van Gogh had a taste for the more deadly alcoholic beverage absinthe), the severe neurosis of Edvard Munch (another excessive drinker), and the depression and tragic suicide of Sylvia Plath. There is nothing however to suggest that Shakespeare or Bach were anything other than entirely sane, or that Mozart, Goethe and Beethoven suffered from frailties or maladies of any lasting psychological kind. The same goes for such modern masters as Picasso, Matisse, Stravinsky, and Mahler – though Mahler did consult Sigmund Freud once for advice on a marital crisis shortly before he died. I could go on and on listing countless sane individuals who excelled in the arts or in other disciplines – indeed Salvador Dalí was another: madness for Dalí being primarily an affectation, as cultured and considered as his trademark moustache, rather than a debilitating affliction.

The problem with all romanticised notions of insanity, especially those upholding insanity as the more honest and thus valid response to an insane world, is twofold. Not only does it detract from the terrible suffering of those victims most truly lost to the world, but also, and vitally, it mistakes madness for freedom. And there is still a further step. Since madness appears to be a natural manifestation, the most extreme of the romanticists have contended more fervently that, rather than being delusionary, such alternative awareness is no less valid, indeed more valid, than our normalised and thus artificial states of domesticated consciousness. This is a wonderfully tempting fancy for all of us who’ve ever had concerns over a loosening “grip on reality”. Consider, for instance, the following syllogistic fallacy: all geniuses are mad, I’m mad ergo…
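Spelled out schematically (a formal gloss of my own, not the author’s; traditional logic knows this pattern as the fallacy of the undistributed middle), the reasoning runs:

\forall x \, (\mathrm{Genius}(x) \rightarrow \mathrm{Mad}(x))   % premise: all geniuses are mad
\mathrm{Mad}(\mathrm{me})                                       % premise: I'm mad
\therefore \ \mathrm{Genius}(\mathrm{me})                       % purported conclusion: does not follow

The first premise licenses inference in one direction only, from genius to madness; knowing merely that I am mad tells us nothing whatever about genius, which is precisely why the ‘ergo’ can never honestly be completed.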

But this again is a very lazy method for cancelling madness, in which unpleasant reality is cheaply dismissed basically out of arithmetic convenience, and the two negatives – the horrors of the world and the terrors of the mind – are determined to add to zero. It simply isn’t good enough to say that madness doesn’t exist, or that madness does exist but it is natural and thus wholesome, or even that madness is really just sanity in disguise. That said, and albeit in a more inspirational way, Dalí is speaking for most of us. For the greatest barrier keeping many of us outside the padded cell is that, like him, “we are not mad”.

*

If sanity and insanity exist, how shall we know them? The question is neither capricious nor itself insane.

So begins a paper published in the journal Science in January 1973, written by David L. Rosenhan, a Professor of Psychology at Stanford University. The “Rosenhan experiment”, as it is now known, in fact involved two related studies, the first of which was certainly one of the most daring ever conducted in the social sciences.

Rosenhan would send seven mentally healthy volunteers, with himself making eight, on a mission to be admitted as patients within the American psychiatric system. These eight courageous ‘pseudopatients’ soon after arrived at the doors of selected hospitals with instructions to say only that they were hearing a voice which pronounced three words: “empty”, “hollow” and, most memorably, “thud”. If admitted, the volunteers were further instructed to act completely normally and to say that they had had no recurrence of those original symptoms. 6

What transpired came as a surprise, not least to Rosenhan himself. Firstly, although none of the volunteers had any prior history of mental illness and none were exhibiting behaviour that could be deemed seriously pathological in any way – Rosenhan having ensured that “[t]he choice of these symptoms was also determined by the absence of a single report of existential psychoses in the literature” – every one of his ‘pseudopatients’ was admitted and so became a real patient. More alarmingly, and as each quickly realised, they had landed themselves in a seemingly intractable catch-22 situation: for how does anyone prove their sanity, once certified insane?

If you say that you are fine, then who is to decide whether your expressed feelings of wellness are delusional? It was certainly not lost on Rosenhan that this is a position all psychiatric patients inevitably find themselves in. In the event, it would take the eight ‘pseudopatients’ almost three weeks on average (19 days to be precise, and in one instance 52 days) to convince the doctors that they were sane enough to be discharged. But it didn’t end there, because all but one were finally discharged with a diagnosis of schizophrenia “in remission”, and as Rosenhan notes:

The label “in remission” should in no way be dismissed as a formality, for at no time during any hospitalization had any question been raised about any pseudopatient’s simulation. Nor are there any indications in the hospital records that the pseudopatient’s status was suspect. Rather, the evidence is strong that, once labeled schizophrenic, the pseudopatient was stuck with that label. If the pseudopatient was to be discharged, he must naturally be “in remission”; but he was not sane, nor, in the institution’s view, had he ever been sane. 7

For a second experiment, Rosenhan then cleverly turned the tables. With the results of his first test released, he now challenged a different research and teaching hospital, whose staff fervently denied that they would have made comparable errors, telling them that over a period of three months he would send an undisclosed number of new ‘pseudopatients’ and that it was up to them to determine which patients were the impostors. Instead Rosenhan sent no one:

Judgments were obtained on 193 patients who were admitted for psychiatric treatment. All staff who had had sustained contact with or primary responsibility for the patient – attendants, nurses, psychiatrists, physicians, and psychologists – were asked to make judgments. Forty-one patients were alleged, with high confidence, to be pseudopatients by at least one member of the staff. Twenty-three were considered suspect by at least one psychiatrist. Nineteen were suspected by one psychiatrist and one other staff member. Actually, no genuine pseudopatient (at least from my group) presented himself during this period. 8
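As a rough back-of-the-envelope reading of those figures (my own arithmetic illustration in Python, not any calculation made in Rosenhan’s paper), the scale of the misjudgement is easily computed:

# Figures as quoted above from Rosenhan's second study
patients_judged = 193          # genuine admissions judged over three months
flagged_by_any_staff = 41      # confidently alleged to be pseudopatients
flagged_by_psychiatrist = 23   # considered suspect by at least one psychiatrist
actual_pseudopatients = 0      # Rosenhan in fact sent nobody

print(f"Flagged by any staff member: {flagged_by_any_staff / patients_judged:.1%}")   # -> 21.2%
print(f"Flagged by a psychiatrist: {flagged_by_psychiatrist / patients_judged:.1%}")  # -> 11.9%

In other words, with staff primed to expect impostors, roughly one in five genuine patients was confidently judged to be sane and faking.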

Rosenhan provocatively although accurately titled his paper “On being sane in insane places”. The results of his study not only undermined the credibility of the entire psychiatric establishment, but his main conclusion, that “we cannot distinguish the sane from the insane in psychiatric hospitals”, touched on a far bigger issue. For aside from challenging existing methods of diagnosis, and calling into question the treatment and stigmatisation of mental illness – in view of what he described in the paper as “the stickiness of psychodiagnostic labels” 9 – the results of his study more fundamentally (and thus controversially) cast doubt on whether psychological ‘normality’ can ever be decisively differentiated from ‘abnormality’ in all instances. Buried within his paper, Rosenhan posits:

… there is enormous overlap in the behaviors of the sane and the insane. The sane are not “sane” all of the time. We lose our tempers “for no good reason.” We are occasionally depressed or anxious, again for no good reason. And we may find it difficult to get along with one or another person –  again for no reason that we can specify. Similarly, the insane are not always insane.

So the ‘sane’ are not always ‘sane’ and the ‘insane’ are not always ‘insane’, although Rosenhan never leaps to the erroneous conclusion (as others have and do) that there is no essential difference between sanity and insanity. He simply responds to the uncomfortable facts as revealed by his studies and implores other professionals who are involved in care and treatment of psychiatric patients to be extra vigilant. Indeed, he opens his paper as follows:

To raise questions regarding normality and abnormality is in no way to question the fact that some behaviors are deviant or odd. Murder is deviant. So, too, are hallucinations. Nor does raising such questions deny the existence of the personal anguish that is often associated with “mental illness.” Anxiety and depression exist. Psychological suffering exists. But normality and abnormality, sanity and insanity, and the diagnoses that flow from them may be less substantive than many believe them to be.

So although his experiment, albeit a small one, had objectively undermined the credibility of both the academic discipline and the clinical practice of psychiatry, his conclusions remained circumspect (no doubt he wished to tread carefully), with the closing remarks of his paper as follows:

I and the other pseudopatients in the psychiatric setting had distinctly negative reactions. We do not pretend to describe the subjective experiences of true patients. Theirs may be different from ours, particularly with the passage of time and the necessary process of adaptation to one’s environment. But we can and do speak to the relatively more objective indices of treatment within the hospital. It could be a mistake, and a very unfortunate one, to consider that what happened to us derived from malice or stupidity on the part of the staff. Quite the contrary, our overwhelming impression of them was of people who really cared, who were committed and who were uncommonly intelligent. Where they failed, as they sometimes did painfully, it would be more accurate to attribute those failures to the environment in which they, too, found themselves than to personal callousness. Their perceptions and behaviors were controlled by the situation, rather than being motivated by a malicious disposition. In a more benign environment, one that was less attached to global diagnosis, their behaviors and judgments might have been more benign and effective. 10

*

Before pursuing this matter by delving into deeper complexities, I would like to reframe the central concept almost algebraically. In this regard I am taking the approach of the stereotypical physicist in the joke, who when asked how milk production on a dairy farm might be optimised, sets out his solution to the problem as follows: “Okay – so let’s consider a spherical cow…” 11

By applying this spherical cow approach to psychiatry, I have produced the following three crude equivalences, which are listed below (each accompanied by brief explanatory notes).

#1. Insanity = abnormality

Normality, a social construct [from etymological root ‘right-angled’], implies conventionality, conformity and being in good relation to the orthodoxy [from orthos ‘straight or right’] such that a person is adjudged sane when they appear to be well-balanced, rational, and functional.

#2. Insanity = unhealthiness

Health, a medical consideration [from root ‘whole’] indicates a lack of pathology and in this case emphasises something akin to good mental hygiene. ‘Health’ in the sense of mental health will correspond to low levels of stress and anxiety; high self-awareness and self-assuredness; to happiness and well-being.

And lastly,

#3. Insanity = psychological maladjustment to reality [from late Latin realis ‘relating to things’], with emphasis here placed on authenticity and realism as opposed to fantasy and delusion.

There is, of course, a good measure of crossover between these three pseudo-identities. For instance, if you are ‘normal’ (i.e., adjusted to society) then you have a greater likelihood of being ‘happy’ than if you are at variance. Moreover, if you’re well-adjusted socially, society as a whole will likely attest to you being ‘well adjusted’ in a broader psychological sense, because ‘reality’ is always to some extent socially construed. Imagine, for instance, being suddenly transported to the caste-ossified and demon-haunted worlds of the Late Middle Ages; would the people determined sane today be thought sane as they disembarked from our imagined time machine, and would they stay sane for long? 12

I have included this rather crude and uncertain section in order to highlight how appearances of ‘madness’ and ‘sanity’ can often be coloured by alternative societal interpretations. As we venture forward, keep this in mind too: societal influences that shape and inform the prevailing notions of ‘normality’, ‘reality’ and even ‘happiness’ are more often latent than manifest.

“Happiness”: the story of a rodent’s unrelenting quest for happiness and fulfilment by Steve Cutts.

*

Did you ever stride warily over the cracks in the pavement? Have you crossed your fingers, or counted magpies, or stepped around a ladder, or perhaps ‘touched wood’ to ward off some inadvertently tempted fate? Most of us have. Are we mad? Not really, just a little delusional perhaps. Though does superstition itself contain the kernel of madness?

What if that compulsion to step across the cracks becomes so tremendous that the pavement exists as a seething patchwork of uncertain hazards? Or if we really, really feel the urge to touch the wooden object over and over until our contact is quite perfect and precise. When the itch is so irresistible and the desire to scratch quite unbearable, this otherwise silly superstition embroils the sufferer (today diagnosed with Obsessive Compulsive Disorder or OCD) in extended rituals that must be fastidiously completed; a debilitating affliction in which everyday routine becomes a torment as life grinds nearly to a halt, the paralysed victim reduced to going round and round interminably in the completely pointless loops of their own devising: life reduced to a barmy and infuriating assault course that is nearly impossible to complete.

As a child, to entertain yourself, did you ever look out for familiar shapes within the amorphous vapour of clouds or the random folds of a curtain? Doubtless you looked up into the night sky to admire the ‘Man in the Moon’, or if you are Chinese, then to spot the rabbit. Both are wrong, and right – connecting the dots being a marvellous human capacity that allows us to be creators extraordinaire. Yet the same aptitude holds the capacity to drive us literally crazy. How about those monsters at the back of your wardrobe or lurking in wait under the bed… and did the devil live around the U-bend of the toilet ready to leap out and catch you if you failed to escape before the flush had ended? It is fun to indulge in such fantasies. Everyone loves a ghost story.

Not that reconstructing faces or other solid forms where none exist involves hallucinating in the truest sense. However, these games, or harmless tics of pattern recognition – which psychologists call pareidolia – do involve our latent faculty for hallucinations; a faculty that is more fully expressed in dreams, or just as we are falling asleep or waking: images technically described as hypnagogic and hypnopompic respectively. Some of us also hear imaginary things: not only “things that go bump in the night”, but occasionally things that go bang upon waking (or on the brink of sleeping). This highly disconcerting experience even has the technical name “exploding head syndrome” – just to let you know, in case you ever suffer from it. Alongside still more frightening and otherworldly apparitions (the worst ones are usually associated with sleep paralysis), auditory hallucinations happen to billions of perfectly sober and otherwise sane individuals.

In fact, it is now known that about one percent of people with no diagnosed mental health problem hear voices on a regular basis – approximately equivalent to the proportion of people who are diagnosed with schizophrenia (and it is important to note here that not all schizophrenics hear voices, nor is schizophrenia the only mental illness in which hearing voices is a symptom). Within the general population, still more of us have fleeting episodes of hearing voices, while very nearly everyone will at some time experience the auditory hallucination of voices on the brink of sleep and waking.

Of course in a different though related sense, we all hear voices: the familiar inner voice that speaks softly as we think, as we read and perhaps as we console ourselves. And how many of us articulate that voice by talking to ourselves from time to time? As young children between the ages of two and eight we all would have done so. Then sometimes, as we literally speak our minds, we also find ourselves listening attentively to what we ourselves have just said aloud in these unaccompanied chinwags; although catching yourself fully in the act as an adult can often come as a bit of a shock – but a shock to whom exactly? So are we mad to talk to ourselves… or, as the joke would have it, just seeking a more intelligent conversation!

In talking to ourselves we immediately stumble upon a remarkable and unexpected division in consciousness too. One-self becomes two selves. The ‘I’ as subjective knower abruptly perceives a ‘me’ as a separate entity – perhaps this known ‘me’ perceived by the knower ‘I’ is deemed worthy of respect (but perhaps not, the knower can decide!) Curiously this is not just a mind becoming vividly aware of its existence as a manifestation (modern science would say ‘epiphenomenon’, as if this were an adequate explanation) of the brain-body – and such consciousness of the material self is strange enough – but the mind becoming literally self-aware, and this self-awareness having endlessly self-reflecting origins, since if ‘I’ begin to think about ‘me’ then there can now exist a further ‘I’ which is suddenly aware of both the original knower and the already known. Fuller contemplation of this expanding hall of mirrors where the self also dwells is very possibly a road to madness: yet this habit of divorcing ‘I’ from ‘me’ is a remarkably familiar one. As usual, our language also gives us away: we “catch ourselves” in the act, afterwards commenting “I can’t believe I did it!” But what if our apprehension of the one-self becomes more broken still, and our sense of being can only be perceived as if refracted through shattered glass: the splintered fragments of the anticipated ‘me’ (whatever this is) appearing horrifically other?

Perhaps we’ve even had intimations of a feeling that we are entirely disconnected from every other part of the universe, and as such, then felt profoundly and existentially cast adrift with no recall of who we are. Such altered states of detachment are known in psychology as ‘dissociation’ and are not uncommon, especially to those with any appetite for ‘recreational substances’. Even alcohol is known to sometimes elicit temporary dissociation. And if these are representative of some of our everyday brushes with madness, then what of our more extended nocturnal lapses into full-blown irrationality: the hallucinations we call dreams and nightmares, and those altogether more febrile deliriums that occasionally take hold when we are physically ill?

These are the reflections of Charles Dickens, after one of his night walks brought on by insomnia led him to nocturnal contemplation of Bethlehem Hospital:

Are not the sane and the insane equal at night as the sane lie a dreaming? Are not all of us outside this hospital, who dream, more or less in the condition of those inside it, every night of our lives? Are we not nightly persuaded, as they daily are, that we associate preposterously with kings and queens, emperors and empresses, and notabilities of all sorts? Do we not nightly jumble events and personages and times and places, as these do daily? Are we not sometimes troubled by our own sleeping inconsistencies, and do we not vexedly try to account for them or excuse them, just as these do sometimes in respect of their waking delusions? Said an afflicted man to me, when I was last in a hospital like this, “Sir, I can frequently fly.” I was half ashamed to reflect that so could I by night. Said a woman to me on the same occasion, “Queen Victoria frequently comes to dine with me, and her Majesty and I dine off peaches and maccaroni in our night-gowns, and his Royal Highness the Prince Consort does us the honour to make a third on horseback in a Field-Marshal’s uniform.” Could I refrain from reddening with consciousness when I remembered the amazing royal parties I myself had given (at night), the unaccountable viands I had put on table, and my extraordinary manner of conducting myself on those distinguished occasions? I wonder that the great master who knew everything, when he called Sleep the death of each day’s life, did not call Dreams the insanity of each day’s sanity. 13

Meanwhile, obsessing over trifling matters is a regular human compulsion. The cap is off the toothpaste. The sink is full of dishes. That’s another tin gone mouldy in the fridge… during times when our moods are most fraught, seething with dull anger and impatient to explode at the slightest provocation, it is the fridge, the sink and the toothpaste that fill our heads with troubles. Presumably again there is a limit beyond which such everyday obsessing becomes pathological. Indeed, I dare to suggest that obsessing over mundanities may be a kind of displacement activity: another distraction from the greatest unknown we all face – our certain endpoint with its dread finality. For we may, not without justification, dread our entire future, and with it the whole world outside our door: just as we may, with due reason based on past experiences, panic at the prospect of every encounter.

But whereas normal levels of fear act as a helpful defence mechanism and a necessary hindrance, the overbearing anxiety of the neurotic comes to stand in full opposition to life. Likewise, although indignation can be righteous and rage too is warranted on occasion, a constantly seething ill temper that seldom settles is corrosive to all concerned. In short, once acute anxiety and intense irritability worsen in severity and manifest as part of a chronic condition, life is irredeemably spoiled; at still greater severity, anxiety and anger will likely be deemed symptoms of a psychiatric condition. The threshold to mental illness is once again crossed, but whereabouts was the crossing point?

Each of us has doubtless succumbed to moments of madness, and not just momentary lapses of reason; perhaps we have entered more extended periods when we have been caught up in obsessive and incoherent patterns of thought and behaviour. Loops of loopiness. Moreover, the majority of us will have had occasions of suicidal ideation, which again remain unspoken in part because they signal a psychological frailty that may point to a deeper pathology, or be mistaken as such. For madness is not really such a faraway and foreign country, and even the sanest among us (so far as this can be judged) are from time to time permitted entry at its gates.

*

II       Conspiracies against the laity

“That a dictator could, if he so desired, make use of these drugs for political purposes is obvious. He could ensure himself against political unrest by changing the chemistry of his subjects’ brains and so making them content with their servile condition. He could use tranquillizers to calm the excited, stimulants to arouse enthusiasm in the indifferent, hallucinants to distract the attention of the wretched from their miseries. But how, it may be asked, will the dictator get his subjects to take the pills that will make them think, feel and behave in the ways he finds desirable? In all probability it will be enough merely to make the pills available.”

— Aldous Huxley 14

*

In earlier chapters I have discussed how science is soon out of its depth when it comes to understanding the mind and states of consciousness, because the province of science is restricted to phenomena that not only can be observed and unambiguously categorised, but thereafter measured with known precision and modelled to an extent that is reliably predictive. Of course, hidden within that statement is an awful lot of maths; however, the use of maths is not the issue here – measurement is.

For measurement becomes scientifically applicable once and only once there is a clear demarcation between the quantities we wish to measure. Length and breadth are easy to separate; time and space, likewise. The same case applies to many physical properties – all of the quantities that physicists and chemists take for granted in fact.

When we come to psychology and psychiatry we are likewise restrained. Brain-states are measurable, and so we investigate these and then attempt to map our findings back onto sense-impressions, memories and moods. For instance, if we locate a region of the brain where these sense-impressions, memories and moods can be stimulated, then we can begin the partial mapping of conscious experience onto brain-states. But we still have not analysed consciousness itself. Nor do we know how brain-states permit volition – the choice of whether to move, and how and where to move, or, just as importantly, the freedom to think new thoughts. In short, how does our brain actually produce our states of mind, our personalities, and the entity we each call ‘I’? As neurologist Oliver Sacks noted in his book A Leg to Stand On, in which he drew on his personal experience of a freak mountaineering accident to consider the physical basis of personal identity:

Neuropsychology, like classical neurology aims to be entirely objective, and its great power, its advances, come from just this. But a living creature, and especially a human being, is first and last active – a subject, not an object. It is precisely the subject, the living ‘I’, which is being excluded. Neuropsychology is admirable, but it excludes the psyche – it excludes the experiencing, active, living ‘I’ 15

We as yet have no grounds whatsoever to suppose that science will ever be able to objectively observe and measure states of consciousness. In fact, what would that actually entail? For we do not have even the slightest inkling what consciousness is, nor, far more astonishingly, do we yet understand how consciousness is routinely and reversibly switched off with the use of general anaesthetics, even though general anaesthetics have been widely and effectively used in surgery for over a century and a half.

Moreover, having acknowledged its non-measurability, some scientists see it as permissible to casually relegate consciousness to the status of an epiphenomenon. That is, science takes the singular certainty of our everyday existence and declines to take any serious interest in its actual reality; in the most extreme case, proclaiming that it is purely illusory… Now think about that for a second: how can you have the ‘illusion of consciousness’? For what vehicle other than a conscious one can support or generate any kind of illusion at all? Although language permits us to frame the idea, it is inherently self-contradictory, and proclaiming the illusoriness of consciousness is akin to deciding on the insubstantiality of substance or the unwetness of water.

South African psychoanalyst and neuropsychologist Mark Solms, who has devoted his career to reconnecting these scientific disciplines, makes a persuasive case, built upon studies of brain-damaged patients, that the source of consciousness cannot lie within the higher-level cortex, as has been generally supposed, but instead involves mechanisms operating within the brain stem.

Furthermore, the literal root of our modern terms ‘psychology’, ‘psychoanalysis’ and ‘psychiatry’ is the Greek word ‘psyche’, with its origins in ‘spirit’ and ‘soul’, and yet each of these disciplines has altogether abandoned this view in order to bring a strictly biomedical approach to questions of mind. No longer divorced from the brain, mind is thus presumed to be nothing more or less than the output of brain function, and so the task of today’s clinicians becomes one of managing these outputs by means of physical or chemical adjustments. To these ends, the origins and causes of mental illness are often presumed to be fully intelligible and detectable in abnormalities of brain physiology, and most specifically in brain chemistry – something I will discuss in greater detail.

Taking such a deeply biochemical approach to mental illness also leads inexorably to questions of genetics, since there is no doubt that genes do predispose every person to certain illnesses; and so, with regard to the issue at hand, we might envisage some kind of psychological equivalent to the physical immune system. There is indeed no controversy in saying that the individual propensity to suffer mental illness varies, or, if you prefer, that we inherit differing levels of psychological immunity. Some people are simply more resilient than the average, and others less so, and this difference in propensity – one’s ‘psychological immune system’ – is to some extent innate to us.

Of course, if genetic propensity were the primary determinant of rates of mental illness, then within any given gene pool we ought to expect a steady level in the rates of diagnosis, given that variations within any gene pool arise comparatively slowly and over multiple generations. Evidently genetics alone cannot therefore explain any kind of sudden and dramatic rise in the incidence of health problems, whether mental or otherwise. One note of caution here: the newer field of epigenetics may yet have something to add to this discussion.

But psyche, to return to the main point, is not a purely biological phenomenon determined solely by genetics and other wholly material factors such as diet, levels of physical activity and so forth. For one thing, mind has an inherent and irreducible social component, and this is the reason solitary confinement and similar forms of deprivation of social stimulus are exceedingly cruel forms of punishment. Taking the still more extreme step of subjecting a victim to the fullest sensory deprivation becomes a terrifying form of torture, and one that rapidly induces psychological breakdown. All of this is well established, and yet still the scientific tendency is to treat minds as just highly sophisticated programmes running on the wetware of our brains. But the wetware, unlike the hardware and software of the computer in front of me, possesses both subjectivity and agency. Put the other way around: the brain isn’t the conscious agent; you are. And it is equally true to say, as the great theoretical physicist Max Planck elegantly pointed out, that consciousness is absolutely foundational:

I regard consciousness as fundamental. I regard matter as derivative from consciousness. We cannot get behind consciousness. Everything that we talk about, everything that we regard as existing, postulates consciousness. 16

Planck is precisely right to say we cannot get behind consciousness. And by everything he quite literally means everything, including of course the brain, although unfortunately we are very much in the bad habit of forgetting this glaring fact.

With developments in neurology and biochemistry, science becomes ever more accomplished at measuring and, again with increasing refinement, is able to alter brain function, and in doing so, to alter states of consciousness. Yet even while a scientist or doctor is manoeuvring a patient’s mind, he remains deeply ignorant of how the change is achieved; and it is worth bearing in mind that methods for altering states of consciousness were known and practised throughout all cultures long before the advent of science.

To offer a hopefully useful analogy, when tackling problems of consciousness, our best scientists remain in the position of a motorist who lacks mechanical understanding. The steering wheel changes direction and two of the pedals make the car go faster or slower – yet another pedal does something more peculiar again that we needn’t dwell on here! Of course, our imaginary driver is able to use all these controls to manoeuvre the car – increasingly well with practice. Added to which, he is free to lift the bonnet and look underneath; however, without essential knowledge of engineering or physics, this provides no eye-opening additional insights. Although such an analogy breaks down (if you’ll pardon my pun), as every analogy here must, because as Planck says, when it comes to consciousness all our understanding of the world, all concepts, are contingent on it, including, in this instance, the concept of mechanisms.

For these reasons we might quite reasonably ask which factors the psychiatrist ought to invest greater faith in: definite quantities or indefinite qualities? Measureable changes in electrical activity or a patient’s reports of mood swings? Rates of blood flow or recognisable expressions of anxiety? Levels of dopamine or the unmistakeable signs of the patient’s sadness and cheerfulness?

More philosophically, we might wonder deeply about what awareness is. How do we navigate the myriad nooks and crannies of the world that our minds (in a very real sense) reconstruct – our perceptions being projections informed by sensory inputs and produced to give the appearance of external reality – in order to inquire into the nature of both the world and the organs of perception and cognition, when the precursory nature of awareness somehow remains tantalisingly beyond all reconstruction? When confronted by these questions science is struck dumb – it is dumbfounded. Obviously, so too is psychiatry.

Mathematician and physicist Roger Penrose has devoted a great deal of time to thinking about the nature of consciousness, and in his best-selling book The Emperor’s New Mind (1989) he explained why science is wrong to presume that consciousness is a purely computational process. In conversation with AI researcher Lex Fridman, Penrose again stresses our lack of basic scientific understanding of consciousness and proffers his own tentative ideas about where we might begin looking; in particular, why investigating the causal mechanisms underlying general anaesthetics looks a profitable place to start.

*

In the early 1960s, tired of signing his name on the skin of naked women, transforming them instantly into living sculptures (and what’s not to like about that?), avant-garde Italian artist Piero Manzoni turned his hand instead to canning his own excrement and selling the tins to galleries. In May 2007, a single tin of Manzoni’s faeces was sold at Sotheby’s for more than £100,000; more recently in Milan another tin of his crap fetched close to a quarter of a million! It would be madness, of course, to pay anything at all for bona fide excrement (and it remains uncertain whether Manzoni’s labels reliably informed his customers of their literal contents), were it not for the fact that other customers were queuing up and happy to pay as much or more. Indeed, if anyone can ever be said to have had the Midas touch, then surely it was Manzoni; just a flick of his wrist miraculously elevated anything at all to the canonised ranks of high art – literally turning shit into gold.

But then the art world is an arena that excels in perversity and so pointing out its bourgeois pretensions and self-indulgent stupidities has itself become a cheap pursuit, while to the initiated it simply marks me out as another unenlightened philistine. What is blindingly obvious to the rest of us has instead become undetectable to the connoisseur, the banality obscured by fashion and their own self-gratification. In an era that is exceptionally cynical and commercial, it comes as no surprise therefore to find the art world reflecting and extolling works of commensurate cynicism and degeneracy. What is more interesting, however, is this contemporary notion that art has finally become anything done by an artist: for we might reasonably ask, does this same approach to validation apply across other disciplines too? For instance, if scientists collectively decide to believe in a particular method or theory, does this automatically make their shared belief somehow ‘scientific’? I pose this as a serious question.

What is more important here is to understand and recognise how all intellectual fields face a similar risk of losing sight of what is inherently valuable, becoming seduced by collective self-deception and wrapped up in matters of collective self-importance. Peer pressure. Groupthink. The bandwagon effect. If you’ve never seen the footage before then I highly recommend watching Solomon Asch’s ‘conformity experiments’, in which test subjects were found to consistently and repeatedly defer to false opinion, in blatant contradiction to what they could see perfectly clearly right in front of their own eyes. 17

In short, most people will “go along to get along”, and this maxim applies across all levels of society and in all spheres of activity, including the sciences. Moreover, it is very seldom the case that a scientific paradigm changes because its opponents are suddenly won over by a novel framework of ideas due to its intrinsic elegance or power, but rather, as Max Planck put it most bluntly (at least as it is usually paraphrased): “Science progresses one funeral at a time”. 18

These problems are additionally compounded by reification: the mistaking of abstractions for solid aspects of reality; of confusing the map with the territory. Related to this is something William James once described as the “Psychologist’s fallacy”:

The great snare of the psychologist is the confusion of his own standpoint with that of the mental fact about which he is making his report. I shall hereafter call this the ‘psychologist’s fallacy’ par excellence. 19

There are actually three ways of interpreting James’ statement here, and each is equally applicable. The first and most general cautions against mistaking one’s personal perception and interpretation of an event for a perfectly accurate account – this strictly applies to all fields of objective research. The next is that it is easy to mistake another person’s experience and falsely imagine it is identical to your own. This ‘confusion of standpoints’ can cause you to believe you know why someone did what they did, by assuming they are motivated in just the same way you are. Then finally, there is an error that applies whenever you are involved in studying another person’s mental state (for whatever reason, and not necessarily in a clinical setting) and you suppose that the subject is likewise critically aware of their own thoughts and actions. This is called ‘attribution of reflectiveness’, and it may occur, for instance, if you come across someone blocking your way and then presume that they are fully aware of the obstruction they have caused to your progress and so are obviously being inconsiderate.

Besides the issues of groupthink and the fallacies outlined above, there is a related difficulty that arises whenever you are constrained by any system of classification, and given how incredibly useful categories are (especially in the sciences), this is again hard to avoid. Whenever a system comes to be defined and accepted, the tendency will always be for adherents to look for and find examples that fit and support it; and if this means cherry-picking the facts then so be it. Within no time an entire discipline can spring up this way, as was the case with phrenology (a subject I shall come back to in a later chapter).

*

George Bernard Shaw neatly remarked that “all professions are conspiracies against the laity”. In the same spirit, we might extend his concern by adding that such conspiracies will tend to feign understanding, disguise ambiguity and perpetuate fallacies. The quip itself comes from Shaw’s play The Doctor’s Dilemma, and was most pointedly aimed at the medical profession. But then, in defence of doctors, medicine as a discipline is arguably the science most plagued by vagueness; a nearly intractable problem given how easily the symptoms of so many diseases can be muddled simply because of their inherent similarities. Consider, for instance, the thousand and one ailments that all present with “flu-like symptoms”.

In turn, patients are equally prone to vagueness when giving accounts of their own symptoms, in part because symptoms are often rather difficult to describe – just how do you distinguish between the various feelings of pain, for instance? To make matters worse, human biology is already fiendishly complex. Textbooks provide only textbook examples: they show ideal anatomy, whereas real anatomies are seldom ideal, and it is surprisingly common for actual patients to have organs whose structure or location differs very markedly.

The unavoidable outcome of all this uncertainty and peculiarity is that medical professionals do not understand nearly half so much as those without medical training are given to believe – and, importantly, choose to believe. For as patients, not only do we seek clear diagnoses, but we look to medicine for sure-fire remedies, all of which encourages the inclusion in medical nomenclature of elaborate – and preferably Latinised – labels for the full gamut of our daily complaints: a complete taxonomy that catalogues and accounts for every combination of symptoms and one or two half-glimpsed maladies. All of which brings us to the consideration of ‘syndromes’ and ‘disorders’.

When your doctor diagnoses abc-itis then, presuming the diagnosis is correct, it is certain that you have inflammation of your abc. Diagnoses of thousands of complaints and diseases are absolutely clear-cut like this. However, if told you are suffering from xyz syndrome, it may mean instead that you are presenting a cluster of symptoms which are recognised to occur in a specific combination; a grouping that crops up often enough to have acquired its label ‘xyz syndrome’, rather than a disease with a well-established or single underlying cause. In short, the term ‘syndrome’ will sometimes hide a lot more than it reveals.

Whenever patterns of symptoms have been rolled together and labelled for the sake of convenience under a single catch-all name, the label is shorthand for saying: we recognise the signs, and though we can’t tell you the cause and as yet remain unable to recommend a cure, we are working on it! Were this shorthand unavailable, the clinician would instead have to shrug their shoulders and usher you away; and given that patients usually have a strong preference for receiving (at the very least) a name for the cause of their suffering, the more customary exchange allows both parties to leave the consultation far happier. We are often content, therefore, to indulge our medical (and other) experts in maintaining many of these Shavian “conspiracies” against us.

Returning to psychiatry, it is necessary to appreciate that all but the rarest of psychiatric diagnoses fall under the category of ‘disorders’ rather than diseases – and that the underlying aetiology in many cases is not just unknown but more or less unconsidered. It follows that historically, the development of diagnoses and treatments has very often had recourse to little more than educated hunches and trial-and-error testing on (all too often) unwilling patients.

As former National Institute of Mental Health (NIMH) Director, Thomas Insel, pointed out:

“While DSM [Diagnostic and Statistical Manual of Mental Disorders] has been described as a “Bible” for the field, it is, at best, a dictionary, creating a set of labels and defining each. The strength of each of the editions of DSM has been “reliability” – each edition has ensured that clinicians use the same terms in the same ways. The weakness is its lack of validity. Unlike our definitions of ischemic heart disease, lymphoma, or AIDS, the DSM diagnoses are based on a consensus about clusters of clinical symptoms, not any objective laboratory measure. In the rest of medicine, this would be equivalent to creating diagnostic systems based on the nature of chest pain or the quality of fever. Indeed, symptom-based diagnosis, once common in other areas of medicine, has been largely replaced in the past half century as we have understood that symptoms alone rarely indicate the best choice of treatment. Patients with mental disorders deserve better.” 20

*

Psychiatrist: Have you ever heard of the old saying “a rolling stone gathers no moss?”

Patient: Yeah.

Psychiatrist: Does that mean something to you?

Patient: Uh… it’s the same as “don’t wash your dirty underwear in public.”

Psychiatrist: I’m not sure I understand what you mean.

Patient: [smiling] I’m smarter than him, ain’t I? [laughs] Well, that sort of has always meant, is, uh, it’s hard for something to grow on something that’s moving.

If you’ve seen the film One Flew Over the Cuckoo’s Nest 21 then you may recognise the dialogue above. It comes when the central protagonist Randle McMurphy (brilliantly played by the young Jack Nicholson) is subjected to a follow-up evaluation carried out by a team of three psychiatrists trying to determine whether or not he is fit enough to be discharged.

Released only a couple of years after Rosenhan and his ‘pseudopatients’ had sneaked under the diagnostic radar, the film casts McMurphy – for reasons which we need not go into – as an apparently sane inmate plunged, much like Rosenhan and his associates, into an infuriating and intractable catch-22 situation.

Now the question posed to McMurphy appears an odd one, yet questions of precisely this kind, commonly based around well-known proverbs, were once regularly used for such diagnostic purposes. Just as with the better-known Rorschach inkblot test, there is no single ‘correct’ answer, but there were built-in ways a patient might fail such an examination. In this case, responses considered too literal were taken as evidence of pathology, on the grounds that they showed an inability on the patient’s part to think in anything other than concrete terms. Simply re-expressing the proverb to account precisely for how a rolling rock makes an inhospitable environment for vegetation is therefore an ill-advised response.

Indeed, McMurphy’s second answer conclusively fails the test, whereas his first stab at saying something deliberately obtuse merely confuses the three doctors. Of course, in the film it is McMurphy’s deeply rebellious nature and truculent behaviour, rather than the results of tests of this sort, that ultimately seal his fate – and again there is no need for details here, except to add that whilst the ramifications of Rosenhan’s experiment challenged opinions within academic and professional circles, the multiple Academy Award-winning One Flew Over the Cuckoo’s Nest reached a far wider audience and helped to change the public perception of how we care for the mentally ill. Moreover, Rosenhan’s criticisms had been restrained, whereas the film – like the book – went straight for the jugular.

Ken Kesey, author of the book One Flew Over the Cuckoo’s Nest, was involved with the film adaptation, but for a variety of reasons, including a dispute over royalties, he left barely two weeks into production and has since claimed never to have watched the final version. Embedded below is a short interview with Kesey talking about the main characters, interspersed with relevant clips:

In the wake of Rosenhan’s experiment (1972) and Kesey’s fictional portrayal of life inside the asylum (published in 1962, released as a film in 1975), the ‘anti-psychiatry’ movement (a term coined in 1967 by one of its most prominent advocates, South African psychiatrist David Cooper) soon began to gain political traction. With the legitimacy of mainstream psychiatry subject to sustained attack and the very concept of mental illness suddenly coming under scrutiny, in the midst of this crisis the American Psychiatric Association (APA) made a decision to release its new manual: a fully updated directory that would authoritatively categorise and thus authenticate all forms of ‘mental disorder’.

The Diagnostic and Statistical Manual of Mental Disorders – soon after known as ‘the bible of psychiatry’ – is now in its fifth edition, DSM-5, and with each updated edition it has become an ever weightier tome, expanding at a faster rate than almost any other technical manual in history. This snowballing really started in 1968, when the revised second edition introduced an additional seventy-six ‘disorders’, thereby expanding the original 1952 catalogue by more than 70 percent. When revised again in 1980, DSM-III added a further 83 diagnostic categories, its list growing from 182 (DSM-II) to 265 (DSM-III) – a 150 percent increase on the original. Although less conspicuously, the same trend continued when DSM-IV was released in 1994, cataloguing a total of 410 disorders – almost a four-fold increase on the original.
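For anyone who wants to check these figures, here is a minimal sketch of the arithmetic in Python. Note that the DSM-I baseline of 106 disorders is inferred from the internally consistent percentages quoted above (76 new entries amounting to a rise of just over 70 percent; 265 being exactly 2.5 times 106), not taken from an authoritative count:

```python
# Disorder counts per DSM edition, as cited in the text above.
# The DSM-I baseline of 106 is an inference from the quoted
# percentages, not an official tally.
counts = {
    "DSM-I (1952)": 106,
    "DSM-II (1968)": 182,
    "DSM-III (1980)": 265,
    "DSM-IV (1994)": 410,
}

baseline = counts["DSM-I (1952)"]
for edition, total in counts.items():
    rise = 100 * (total - baseline) / baseline
    print(f"{edition}: {total} disorders "
          f"({rise:+.0f}% on 1952, {total / baseline:.1f}x)")

# DSM-II works out at +72%, DSM-III at +150% (2.5x), and
# DSM-IV at +287% - very nearly a four-fold increase.
```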

James Davies is a Reader in Social Anthropology and Mental Health at the University of Roehampton, a psychotherapist, and co-founder of the Council for Evidence-based Psychiatry. In trying to understand how the present manual came to be constructed, he decided to speak to many of its authors directly, and so in May 2012 he took a trip to Princeton. There he was welcomed by Dr Robert Spitzer, who had chaired the core team of nine people that put together the seminal third edition of the DSM, which amongst other things established the modern diagnostic system still broadly in operation. It was this edition of the manual that introduced such household-name disorders as Borderline Personality Disorder and Post-Traumatic Stress Disorder. For these reasons, Spitzer is widely regarded as the most influential psychiatrist of the last century.

Davies began his interview by asking Spitzer about the rationale behind the significant expansion in the number of disorders in the DSM-III edition, and Spitzer told him:

“The disorders we included weren’t really new to the field. They were mainly diagnoses that clinicians used in practice but which weren’t recognised by the DSM or the ICD.” 22

Davies then pressed further and asked how many of these disorders had been discovered in a biological sense. In reply, Spitzer reminded him that “there are only a handful of mental disorders… known to have a clear biological cause”, adding that organic disorders like epilepsy, Alzheimer’s and Huntington’s are “few and far between”, and conceding that no biological markers have been identified for any of the remaining disorders in the DSM. With this established, Davies then asked how the DSM taskforce determined which new disorders to include. Spitzer explained:

“I guess our general principle was that if a large enough number of clinicians felt that a diagnostic concept was important in their work, then we were likely to add it as a new category. That was essentially it. It became a question of how much consensus there was to recognise and include a particular disorder.” 23

Davies also spoke to Dr Theodore Millon, another of the leading lights on Spitzer’s taskforce, to ask more about the construction of their manual. Millon told him:

“There was little systematic research, and much of the research that existed was really a hodgepodge – scattered, inconsistent, and ambiguous. I think the majority of us recognised that the amount of good, solid science upon which we were making our decisions was pretty modest.” 24

Afterwards, Davies put Millon’s points directly to Spitzer, who responded:

“Well it’s true that for many of the disorders that were added, there wasn’t a tremendous amount of research, and certainly there wasn’t research on the particular way that we defined these disorders… It is certainly true that the amount of research validating data on most psychiatric disorders is very limited indeed.”

Adding that:

“There are very few disorders whose definition was a result of specific research data.” 25

On the basis of Spitzer’s surprising admissions, Davies then tracked down other members of the same DSM team. For instance, he spoke on the phone to Professor Donald Klein, another leader on the taskforce, who said:

“We thrashed it out basically. We had a three-hour argument… If people [at the meeting] were still undecided the matter would be eventually decided by a vote.” 26

Finally, Davies decided to check what he was hearing from these members against the minutes of the taskforce meetings, which are still held in the archives, and he discovered that voting did indeed take place to make such determinations. Renee Garfinkel, a psychologist who participated in two DSM advisory subcommittees, told Davies more bluntly:

“You must understand what I saw happening in these committees wasn’t scientific – it more resembled a group of friends trying to decide where they want to go for dinner.”

She then cited the following concrete example of how one meeting had proceeded:

“As the conversation went on, to my great astonishment one Taskforce member suddenly piped up, ‘Oh no, no, we can’t include that behaviour as a symptom, because I do that!’ And so it was decided that that behaviour would not be included because, presumably, if someone on the Taskforce does it, it must be perfectly normal.” 27

Although put together by a rather small team, DSM-III has had a far-reaching and long-lasting influence on psychiatry. Spitzer told Davies:

“Our team was certainly not typical of the psychiatry community, and that was one of the major arguments against DSM-III: it allowed a small group with a particular viewpoint to take over psychiatry and change it in a fundamental way.

“What did I think of that charge? Well, it was absolutely true! It was a revolution, that’s what it was. We took over because we had the power.” 28

In any case, reliance upon a single definitive and encyclopaedic work of this kind presents a great many hazards. As Allen Frances, the former chairman of the psychiatry department at Duke University School of Medicine, who led the taskforce that produced DSM-IV, has publicly admitted:

At its annual meeting this week [in May 2012], the American Psychiatric Association did two wonderful things: it rejected one reckless proposal that would have exposed nonpsychotic children to unnecessary and dangerous antipsychotic medication and another that would have turned the existential worries and sadness of everyday life into an alleged mental disorder.

But the association is still proceeding with other suggestions that could potentially expand the boundaries of psychiatry to define as mentally ill tens of millions of people now considered normal.

In the same op-ed published by the New York Times, Frances continued:

Until now, the American Psychiatric Association seemed the entity best equipped to monitor the diagnostic system. Unfortunately, this is no longer true. D.S.M.-5 promises to be a disaster — even after the changes approved this week, it will introduce many new and unproven diagnoses that will medicalize normality and result in a glut of unnecessary and harmful drug prescription. The association has been largely deaf to the widespread criticism of D.S.M.-5, stubbornly refusing to subject the proposals to independent scientific review.

Many critics assume unfairly that D.S.M.-5 is shilling for drug companies. This is not true. The mistakes are rather the result of an intellectual conflict of interest; experts always overvalue their pet area and want to expand its purview, until the point that everyday problems come to be mislabeled as mental disorders. Arrogance, secretiveness, passive governance and administrative disorganization have also played a role.

New diagnoses in psychiatry can be far more dangerous than new drugs. 29

In an earlier interview with Wired magazine, Frances – credited as “the guy who wrote the book on mental illness” – made an even more startling confession, telling Gary Greenberg, who is himself a practising psychotherapist:

“[T]here is no definition of a mental disorder. It’s bullshit. I mean, you just can’t define it… these concepts are virtually impossible to define precisely with bright lines at the boundaries.” 30

*

The entry of psychiatry into the province of science is comparatively recent. Indeed, in the ancient world and in times prior to the Enlightenment, some severe forms of mental illness would most likely have appeared to be the work of demons. And if a person was believed to be possessed, then religious protocols – informed by the opinion that their soul was in existential peril and would, without intervention, suffer eternal damnation – called for extremely drastic measures.

Indeed, the very word psychiatry derives (as mentioned above) from the Greek psukhē for ‘breath, life, soul’ (Psyche being also the Greek goddess of the soul), though in accordance with the strict biomedical model of mind, psychiatry today takes no interest in these ‘spiritual’ matters. Nevertheless, the interventions of psychiatry to save a person’s mind have often been as drastic as, and if anything crueller than, those inflicted throughout prior ages. The dark arts of exorcism and trepanning were superseded and upgraded by technological means: the unfortunate victims at first subjected to convulsions induced by administering an overdose of insulin, then latterly by means of high-voltage electric shocks passed between the temples (electroconvulsive therapy or ECT). Still more invasive treatments were introduced throughout the twentieth century that excised a patient’s demons by means of irreversible surgical mutilation.

When we retrace the short but terrible history of psychiatry, it is rather easy to overlook how many of these barbaric pseudoscientific treatments were once lauded as state-of-the-art. As recently as 1949, the Portuguese neurologist António Egas Moniz actually shared the Nobel Prize for Medicine for turning the lobotomy into a routine procedure; his original method was later adapted by the American neurologist Walter Freeman, who used an ice-pick hammered through the eye socket to sever the frontal lobes. Such horrific procedures were frequently performed without anaesthetic and led to the destruction of the minds – although I am tempted to say souls – of tens of thousands of people, the majority of whom were women (homosexuals were also prominent amongst the victims). This use of so-called ‘psychosurgery’ was phased out gradually, but lobotomies continued to be performed into the 1970s and even later. 31

Today it is also an open, if dirty, secret that throughout modern times psychiatry has played a pivotal role in the coercion of political opponents of the state. Many authoritarian regimes – the former Soviet Union being the most frequently cited – operated their mental health systems as a highly efficient means of cracking down on dissidents (who more or less by definition failed to think ‘normally’). The abuse of psychiatry by western governments is less well known; however, at the height of the Cold War, the CIA carried out a whole range of experiments under Sidney Gottlieb’s MKUltra mind control programme.

One of the MKUltra researchers was Ewen Cameron, a one-time President of the American Psychiatric Association (APA), who went so far as to attempt to erase his patients’ existing memories entirely by means of massive doses of psychotropics and ECT, in attempts to reprogram the victim’s psyche from scratch. Decades later, some of the survivors won financial compensation for their part in this secret regime of state-sponsored torture. 32 Moreover, this close collaboration between the military intelligence services and the mental health professions has continued: during the “War on Terror”, a number of ‘operational psychologists’ are now known to have worked on the CIA’s “enhanced interrogation” torture programme. 33

Of course, state coercion is not always aimed at controlling political enemies. Minorities who have suffered discrimination for other reasons have likewise fallen victim to psychiatric abuse. In fact, prior to 1973, when homosexuality was still designated a disease and listed among the ‘mental disorders’ of the DSM ‘bible’, otherwise healthy gay men were forcibly subjected to aversion ‘therapies’ that included electric shocks to the genitals and nausea-inducing drugs administered simultaneously with the presentation of homoerotic stimuli. In Anthony Burgess’s novel A Clockwork Orange (1962), aversion treatment of this kind was called “the Ludovico Technique”.

Thus historically, the insane subject – i.e., anyone who is diagnosed as mentally ill – has been uniquely deprived of their basic human rights: downgraded in social status and transformed de facto into a kind of second-class human. Even today, when clinical procedures are kinder, patients are routinely subjected to many involuntary treatments, including the long-term administration of powerful drugs and ECT.

*

Leaving aside the moral questions, this terrible history also casts a shadow over the whole science underpinning these treatments. What do we really know about the efficacy of ECT today that we didn’t know in the middle of the last century?

Or consider the now familiar labelling of drugs as ‘antipsychotic’ and ‘antidepressant’: terms that are wholly misleading and deeply unscientific, since the implication is that these are antidotes, much like antibiotics, acting to cure a specific disease by targeting its underlying pathology. But this is entirely false, and the reason it is misleading can best be understood by once again reviewing the history of psychiatry.

Firstly, it is important to recognise that none of the first generation of psychiatric drugs was developed for the purpose either of alleviating neurological dysfunction or of enhancing brain activity. Chlorpromazine (CPZ) – marketed under the brand names Thorazine and Largactil – the earliest of these ‘antipsychotics’, had previously been administered as an antihistamine to relieve shock in patients undergoing surgery, although it was in fact derived from a family of drugs called phenothiazines, originally used as antimalarials and to combat parasitic worm infestations. 34

It had been noticed, however, that many of the patients who received Thorazine would afterwards manifest mood changes and, in particular, experience a deadening in their emotional response to the external world while otherwise retaining full consciousness. In short, the drug happened to reproduce the effects observed in patients who had undergone a surgical lobotomy (which in 1950 was, of course, still considered a highly effective treatment for psychosis).

On the other hand, ‘antidepressants’ emerged as a by-product of research into tuberculosis, after it was noticed that some patients in the trials became more roused following their medication. Only in the aftermath of studies carried out during the 1960s did science finally begin to understand how these pharmaceuticals were having direct effects within the brains of patients, and specifically on processes involving, respectively, the neurotransmitters dopamine and serotonin. In patients suffering psychosis there was found to be an excess of the former, whereas those suffering depression showed an apparent deficit of the latter. The conclusion followed that the drugs must have been acting to correct an existing imbalance, very much as insulin does in the case of diabetes.

So the conclusions from these early studies were drawn wholly from the mechanism of action of the drugs. Since the antipsychotics were found to block dopamine receptors, the hypothesis formed that psychosis must be due to an excess of dopamine activity; likewise, since antidepressants held serotonin longer in the synaptic cleft (the space that separates and forms a junction between neurons), thereby boosting its activity, it followed that depression was the result of low serotonin activity. This reasoning turns out to be inherently flawed, however, and as subsequent research quickly revealed, the actual differences in brain chemistry detected in patients were a feature not of any underlying pathology associated with their disorder, but instead a direct effect of the medications used to treat them. Indeed, for decades clued-up pharmacologists and many psychiatric practitioners have regarded the theory of ‘chemical imbalance’ not as a scientific model but as nothing more than a metaphor: a means of explaining the use of the treatment to patients, as well as an encouragement.

This is what Ronald W. Pies, Editor-in-Chief Emeritus of Psychiatric Times, wrote a decade ago about the ‘theory of chemical imbalance’:

“I am not one who easily loses his temper, but I confess to experiencing markedly increased limbic activity whenever I hear someone proclaim, “Psychiatrists think all mental disorders are due to a chemical imbalance!” In the past 30 years, I don’t believe I have ever heard a knowledgeable, well-trained psychiatrist make such a preposterous claim, except perhaps to mock it. On the other hand, the “chemical imbalance” trope has been tossed around a great deal by opponents of psychiatry, who mendaciously attribute the phrase to psychiatrists themselves. And, yes—the “chemical imbalance” image has been vigorously promoted by some pharmaceutical companies, often to the detriment of our patients’ understanding. In truth, the “chemical imbalance” notion was always a kind of urban legend – never a theory seriously propounded by well-informed psychiatrists.” 35

David Cohen is Professor of Social Welfare and Associate Dean for Research and Faculty Development at UCLA Luskin. His research looks at psychoactive drugs (prescribed, licit, and illicit) and their desirable and undesirable effects as socio-cultural phenomena “constructed” through language, policy, attitudes, and social interactions. Here he explains how psychiatry has painted itself into a corner and become unable to look for treatments for mental illness that lie outside the biomedical model, which treats all conditions of depression, anxiety and psychosis as brain disorders:

Today we have become habituated to the routine ‘medication’ of our youth, with children as young as six years old being administered tranquilisers relabelled as ‘antidepressants’ and ‘antipsychotics’ that are intended ‘to cure’ dysfunctions like “oppositional defiant disorder”. These considerations bring us to the broader issue of what constitutes ‘mental health’, and by extension, what it is to be ‘normal’.

Moreover, it hardly needs saying that increased diagnosis and prescription of medication of every variety is demanded by the profit motive of the pharmaceutical industry, so for now I wish merely to add that we have no demonstrable proof that the identified rise in diagnoses is wholly attributable to a commensurate rise in mental illness, rather than being an artefact bound up with the medicalisation of the human condition. However, given that mental health is expressly bound up with, and to a great extent defined by, a person’s feelings of wellness, attempts to downgrade or dismiss patient testimony, or to overrule personal accounts of psychological distress by declaring some part of them illusory, are not only callous but another kind of category mistake. Whatever terminology we apply, it is evident that more people than ever are suffering forms of psychological distress. I shall consider this at greater length in the final section.

*

Before continuing, I would like to introduce a genuinely serendipitous finding – a newspaper clipping torn out by someone I have never met, and left inside the cover of a second-hand book for reasons I shall never know. I cannot even reference this item because I have no idea in which newspaper it was originally printed, and so will simply label it Exhibit A (the author’s name is also redacted out of courtesy):

“Someone close to me has been smoking cannabis for many years,” the author tells us, adding “That person has never worked and lives in a state of euphoria.”

From these preliminary remarks it is actually hard to tell whether the writer is issuing a caution or an endorsement of pot smoking – or at least it would be hard to tell, were it not for our informed social prejudices, since the presumed social norm is that work is always good and drugs (meaning illegal ones) unconditionally bad. Suppose, however, this surmised state of euphoria had been ascribed to quite different causes. Let’s say, for example, that the person in question was in love, or that s/he’d found God, or alternatively that s/he had been prescribed a legally sanctioned medicine lifting them from a prior state of depression and anxiety, and this lasting euphoria was the outcome. Would this not be a good thing?

But the next part of the letter is perhaps the most interesting. It begins: “People on cannabis lose touch with reality. They cannot relate to normal life because they are in a perpetual state of relaxation, and doing everyday tasks or even getting up in the morning is a big deal. They drift from day to day.”

At this point, I ought to make a personal confession. The person described here is me – not me in all actuality, but another me, another drifter. It is me and a considerable number of my closest friends, who have spent a great many years smoking pot and “losing touch with reality”. Doubtless, it will describe the lives of some of the people who happen to read this too. Personally, I gave up smoking pot years ago for health reasons, and I do not advise others to follow my lead either way. Undeniably, there is some truth within the letter, but there is also a great deal of misunderstanding.

Do pot smokers live in realms of make-believe? Do we care about nothing? Interestingly, we could just as easily ask the same questions of those prescribed SSRI (selective serotonin reuptake inhibitor) antidepressants like Prozac, and all of the other legally sanctioned mind-altering substances. Leaving aside social acceptance, which surely owes much to the profit motive, what other distinction can we make here once we dismiss the false hypothesis of redressing a chemical imbalance?

Of course, none of us ever knows what might otherwise have been had we not done such and such. The road not taken is forever unknown. The only fair question therefore must involve regret, and I confess that I do not regret my decision to smoke pot, nor do I know any friends who have told me they regret their own choice in this regard. The important point I wish to emphasise is that legal determinations do not automatically establish what is conducive to our better health and well-being, nor do they determine what is right and wrong in a moral sense. Indeed, who dares to tell another adult how they ought to think, and equally who dares to say how one may or may not alter one’s consciousness by whatever means one sees fit? If we are not entirely free to think as we choose, then as creatures so fully submerged in our thoughts, we can hardly be said to be free at all.

*

Here is David Cohen again, discussing how psychiatric drugs are no different in kind from many street drugs:

David Nutt, Professor of Neuropsychopharmacology at Imperial College London, has closely studied the range of harms that legal and illegal drugs can do to individuals and society. On the basis of his findings, he has reached the perhaps surprising conclusion that policy should begin with an end to the criminalisation of all drug use and possession. In March 2021 he was interviewed by Novara Media’s Ash Sarkar:

Comedian Bill Hicks also had his own opinions on why some drugs are taxed while others are banned [warning: very strong language and opinions!]:

*

III      Driven crazy?

“[P]eople who experience themselves as automata, as robots, as bits of machinery, or even as animals… are rightly regarded as crazy. Yet why do we not regard a theory that seeks to transmute persons into automata or animals as equally crazy?”

— R. D. Laing 36

*

Type the words ‘mental health crisis’ into any search engine and you will find more than a million pages with links to reports from Australia, Canada, Europe and America, all presenting stark evidence that the Western world is in the grip of what in other contexts would certainly be called a pandemic: a plague of disease that is horribly debilitating, too often fatal, and affecting nearly one in ten of our population – men and women, children and the old alike. According to the latest surveys, in any given week in England 1 in 6 people (15%) report experiencing some kind of mental health problem. In just over twenty years (1993 to 2014) the number of people experiencing mental health problems went up by 20%, while the proportion reporting severe mental health symptoms in any given week rose from 7% in 1993 to over 9% in 2014. 37 Indeed, this issue has now become such a grave one that it receives serious attention in political debates. Still more positively, ways to deal with it are today widely discussed, and the stigma associated with mental illness is at last aired and challenged across the mainstream. But one question is very seldom addressed: what has generated so much suffering and distress in the first place? What is the cause of this now admitted mental health crisis?

Since the issue is obviously an extremely complex one, I propose that we break it down into three parts that can be abbreviated as three A’s: access, accountancy and aetiology. The most simplistic assumption we could make would be that our current crisis is a consequence of just one of these three factors. So, for instance, if the rise in case numbers is purely a matter of easier access to treatment, then it follows from our presumption that there is no underlying increase, but that sufferers of mental health problems are simply more able and willing to seek professional help. If true, then ‘the crisis’ has always existed, but previously the greatest number simply suffered in silence.

Alternatively, we might presume that the rise is a perceived one, and that its origin lies entirely in changes of accountancy, in which case states of mind that in the past were undifferentiated from the norm have gradually been medicalised, as I have discussed above. Whereas improved access to care is a laudable good, if accountancy is to blame then, by contrast, society is increasingly in the business of treating the sane as if they were sick. Reclassifying normality as abnormality would mean psychiatry has helped create the illusion of an epidemic, although it is important to understand that it does not follow that the suffering itself is illusory, only that our tendency is to see that suffering as psychiatric in nature.

Alternatively again, we might instead conclude that the rise in cases is real and unrelated to either ease of access or what has been described as “the medicalisation of misery”. In this case, we are necessarily drawn into the matter of aetiology and must extend the investigation to search for underlying external causes – causes that to some degree can be found to account for a genuine rise in mental illness.

Certainly these aren’t mutually exclusive considerations, but are the three A’s exhaustive? Broadly considered, yes; however, a breakdown of this kind has indistinct, fuzzy edges, and all that is certain is that some combination, or potentially even a synergy, operates between the three. Indeed, given that mental health is expressly bound up with, and unavoidably defined by, feelings of wellness, no psychiatric diagnosis can ever be scientifically objective in the strictest sense. Setting aside, therefore, the matter of access to better healthcare – which, all else being equal, is wholly positive – my aim in the remainder of this chapter is to disentangle the other strands.

In one sense the mental health crisis is undeniably real. More and more people are suffering forms of psychological distress and in no way do I mean to suggest otherwise. There is an urgent need therefore to get to the bottom of what is causing this crisis.

Johann Hari is a journalist who spent many years investigating the causes of depression, the reasons why the West is seeing such a rise in incidence, and how we might find better alternatives to treat this epidemic. It isn’t caused by a chemical imbalance in our brains, he notes at the outset, but crucially by changes in the way we are living:

*

The evidence of a connection between what happens in childhood and its effects on later behaviour is very strong indeed. This is unsurprising of course. It is perhaps self-evident that mental illness grows out of trauma and hunger, which are the bitter fruits of abuse, neglect and abandonment, both physical and psychological. But to explain the ongoing rise (affecting adults as much as children) we would be hard pressed to attribute much cause to changes in parenting styles, given how steep the rise is – a 20% increase over just two decades – and very definitely not if Philip Larkin is to be believed. 38

To be frank, parents have always “fucked you up”, as for that matter have our siblings, our peers, and undoubtedly, many of our fucked-up teachers. Of course, one significant change during recent decades is that parents spend more time working, thus leaving children with childminders or, if money is tight, with the keys to an empty house. Studies again unsurprisingly show that latchkey kids are more susceptible to behavioural problems.

A related issue affecting early development is the omnipresence of new technologies. Once the pacifier was television, but this single-room distraction has been slowly superseded by the introduction of computer games, iPhones, etc. There is a widespread dependency on these types of electronic devices, and so, without any immediate control group, the psychological damage caused by habitually engaging in such virtual interactions will be extremely difficult to gauge.

Of course, television has been used as an infant pacifier for as long as I can remember. No doubt it once pacified me too. But television itself has been radically transformed. It has become louder, brighter and more intense due to faster and slicker editing, and, since its sole purpose is to grab attention and transfix its audience, it is surely reasonable to presume that it has become more and more intoxicating. Viewing TV today is a fundamentally altered experience compared to viewing it decades ago. Could any of this be having a knock-on effect with regard to attention span, cognitive skills, or, more importantly, our sense of self? This is a highly complex issue that I shall not delve into here – in the addendum I do, however, consider the psychological and societal impacts of advertising (I also dedicate a later chapter to the role advertising plays in our society).

What is known for certain is this: that other than in exceptional instances when the origin of severe mental illness can be directly traced to an underlying physical disease (syphilis is perhaps the best known example), the usual trigger for mental health problems is found to be either sudden or prolonged trauma – very often although not exclusively childhood trauma – and the development of the vast majority of mental disorders occurs therefore as a pathological but defensive response to trauma.

*

Following Freud, the causes of mental illness came to be thought of as buried deep within the patient’s unconscious. For this reason, Freud and the psychoanalysts pioneered their ‘talking cure’: conversational techniques that probed deep into the psyche. Various schools arose. They inquired into dreams, biography, sexuality, family relations or even spirituality, feeling down for the roots of their patient’s distress. Once the psychical wound was discovered, it might be cleansed and disinfected by means of further introspection. Healing then came about as nature took its course. Here the patient plays a central role in their own treatment.

R. D. Laing dignified his patients in another way. Refraining from excessive presumptions built on the unsteady and evolving theories of the unconscious – the Oedipal Complex, Penis Envy, and the other fabulous chimera detected by Freud and his followers – Laing gave his patients the common respect the rest of us outside the padded walls of the asylum receive from our peers. No matter how superficially crazy, he adjudged every patient’s account of his or her lived experience to be as valid in the existential sense as the truthful account of any sane human being, including his own. This exceedingly hazardous (some might say reckless) approach to a patient’s illness did, however, produce remarkable outcomes – at least to begin with – as many of those he treated speedily recovered and were declared fit enough to return home.

However, Laing’s successes seldom lasted long, and predictably, within just a few months, more than half would drift back into his care. Witnessing this cyclical pattern of decline had an interesting effect on Laing, for it caused him to reach a new and shocking conclusion. With no less conviction than before, he let it be known that social relationships, and especially those within families, were the major triggers of his patients’ relapse. This was an audacious diagnosis which, unsurprisingly, met with general hostility, as the accused – not only the families but society as a whole – felt immediately affronted by the charge that they were the fons et origo of the patient’s sickness.

Undaunted, Laing took his ideas to their logical extreme. He allowed his patients to play out their madness to the full, believing that for a lasting cure the condition must be allowed to run its course – and who can honestly say if and when madness is fully cured? Unconstrained by the boundaries of orthodox medicine, Laing and his fellow therapists would enter perilously into the worlds of their patients. Laing himself, by all accounts, went somewhat bonkers in the process, which is hardly surprising, since whatever madness is, it is most certainly contagious (and after all, this in a nutshell is really Laing’s central point). 39

As his conduct became morally questionable – sexual affairs with his patients creating troubles within his own family – his professional reputation was understandably tarnished, and alongside this reputational decline his ideas went out of fashion. In spite of this, Laing’s legacy persists in important ways. The more dignified respect now shown to sufferers of mental illness (who even today are sadly denied full human rights equivalence) owes a great deal to Laing’s daring intellectual courage and integrity. On the other hand, the true and lasting value of Laing’s work has been both forgotten and dismissed. For when he tells us that insanity is “a perfectly rational adjustment to an insane world” 40, then given the rise of today’s ‘mental health crisis’, our mental health professionals and society more broadly need to listen up.

In a world that’s ever slicker and faster, where human contact becomes more distant and superficial – increasingly artificial, indeed – the modern self (perhaps that should read ‘postmodern’) is more atomised and systematised than in Laing’s time (Laing died three decades ago). Cajoled to sacrifice ever more individuality for the sake of conformity, convenience, security and status, our given raison d’être is to engorge our material well-being, either for its own pleasure or, more egotistically, in shows of conspicuous consumption. We are, as T.S. Eliot put it so elegantly, “distracted from distraction by distraction/ filled with fancies and empty of meaning”. 41

*

“The normal process of life contains moments as bad as any of those which insane melancholy is filled with, moments in which radical evil gets its innings and takes its solid turn. The lunatic’s visions of horror are all drawn from the material of daily fact. Our civilization is founded on the shambles, and every individual existence goes out in a lonely spasm of helpless agony.” 42

These are the grim observations of William James, another pioneer of the field of psychology, who is here trying to get to grips with “the unpleasant task of hearing what the sick souls, as we may call them in contrast to the healthy-minded, have to say of the secrets of their prison-house, their own peculiar form of consciousness”. James’ vocabulary is remarkably direct and unambiguous, so allow me to very briefly skim the thesis of what he saw as the underlying cause of madness, sticking closely to his original terminology wherever possible.

Their “morbid-minded way”, James reluctantly concedes, should not be too readily dismissed. “With their grubbing in rat-holes instead of living in the light; with their manufacture of fears, and preoccupation with every unwholesome kind of misery…” it may appear to the “healthy-minded” as “unmanly and diseased”, but, on the other hand, “living simply in the light of good”, although “splendid as long as it will work”, involves us in a partial denial of reality which “breaks down impotently as soon as melancholy comes”. Furthermore, says James:

“… there is no doubt that healthy-mindedness is inadequate as a philosophical doctrine, because the evil facts which it refuses positively to account for are a genuine portion of reality; and they may after all be the best key to life’s significance, and possibly the only openers of our eyes to the deepest levels of truth.”

With the advent of modern comforts and our immersive condition of historically unprecedented safety and security, it can appear that those of us born in the wealthiest regions of the world have little reason to grumble, certainly when compared to the conditions endured by previous generations. Indeed, for anyone in Britain born into the working class or above, the famous words of Tory Prime Minister Harold Macmillan that “we’ve never had it so good” do mostly still apply. Studies have shown, of course, that social equality correlates far more closely with overall levels of happiness than absolute levels of wealth do 43, but no less apparent is the more straightforward fact that, having become materially satisfied, we find what we might call ‘psychological immiseration’ more widespread than ever.

With material wants met, we are left to tread a vertiginous tightrope that has been called ‘happychondria’: the perpetual and single-minded pursuit of happiness per se that makes us achingly self-aware of our shortcomings in this narrow regard. And feelings of an ‘unbearable lightness of being’ become all the lighter once our striving to be happy burgeons into an all-consuming monomaniacal fixation, since happiness alone is insufficient to ground us and make us feel real. Worse still, as James explains, perpetual happiness is absolutely unattainable, due to the inevitable travails of life and given most people’s tangential urge to negotiate life’s experiences authentically. Or, putting matters the other way around, since most people inevitably fail to attain the levels of happiness socially demanded, such non-stop pursuit of happiness (and by ‘pursuit’ here I mean ‘chasing’ rather than ‘activity’ or ‘recreation’ 44) will have adverse effects and very likely result in neurosis and feelings of moroseness. The etymological root of our word ‘happiness’ is revealing in this regard: ‘hap’ meaning luck or good fortune. Dormant in the language is a vestigial memory that happiness is a gift bestowed, rather than a treasure seized.

*

Unable to function for long or to endure everyday states of consciousness, a growing number of people are now turning either to legally prohibited narcotics or to prescribed and medically endorsed opiates: drugs that lift the clouds of emptiness, or else numb the user to the tawdriness of everyday reality. These pills offer a surrogate escape when it can no longer be supplied by the local shopping mall, or, always more persuasively, by TV and similar distractions online – both places where our big pharmaceutical companies go to enhance their profits by continuously pushing more of their psychoactive wares.

To a great extent, these powerful industries, whether through lobbying or via alternative means of self-promotion, have gradually reshaped psychiatry itself. The patient who was once central to their own treatment has been made peripheral once again, as the psychiatrist gets on with mending their mental apparatus. And for ‘mending’ it is better to read ‘made happier’, or else ‘made normal’: a transformation centred on societal functioning, but one that may or may not be life-enhancing in a fuller and more meaningful sense. So does it finally matter if society becomes ‘happier’ and people are better able to cope due only to the widespread use of pharmaceuticals? And does it matter if children as young as six are prescribed a daily dose of mind-altering drugs just to fit in and get by? 45

What if anguish and sorrow are vital parts of an authentic experience of life, and, as a good friend and poet once put it: “woe is part of the worship”? To rebut sorrow and utterly disregard the origins of distress seems to me irredeemably Panglossian, which is surely no less life-denying than its polar opposite, a fatalistic surrender to misery. Because to be able truly to affirm in capitals – to say “YES” – is finally predicated on our capability to scream, no less defiantly, “NO!” In the finish, it is zombies alone that are unable ever to scream “NO!”, especially once confronted by the recurring cruelties and stupidities of this sometimes monstrous world.

Fritjof Capra says that Laing once told him, “Mystics and schizophrenics find themselves in the same ocean, but the mystics swim whereas the schizophrenics drown.” And latent within even the most zombified of people, there must linger, no matter how faintly, an inextinguishable inner presence akin to spirit, to soul; a living force that cannot be fully disabled without untold consequences. It is this inner life that fights on and kicks against the main object it can kick against: those modes of thinking and behaving that the ‘normal world’ sanctions and calls ‘sane’, but which the organism (sometimes correctly) identifies as aspects of an inexplicable, incomprehensible and literally terrifying existential threat.

This is how Laing understood the nature of madness, and Laing was one of the sanest (both under legal and more popular definitions) ever to have stayed so close to its shadow. He studied the mad without ever flinching away; listening on with patient compassion to their plight. He stayed open and survived. In an important sense, he trusted their testimony. If we wish to understand what is happening to us, I believe we ought to trust just one of his findings too. As Laing concludes in the same preface to his book The Divided Self:

“Thus I would wish to emphasize that our ‘normal’ ‘adjusted’ state is too often the abdication of ecstasy, the betrayal of our true potentialities, that many of us are only too successful in acquiring a false self to adapt to false realities” 46

While on another occasion he wrote still more emphatically:

“From the alienated starting point of our pseudo-sanity, everything is equivocal. Our sanity is not ‘true’ sanity. Their madness is not ‘true’ madness. The madness of our patients is an artefact of the destruction wreaked on them by us and by them on themselves. Let no one suppose that we meet ‘true’ madness any more than that we are truly sane. The madness that we encounter in ‘patients’ is a gross travesty, a mockery, a grotesque caricature of what the natural healing of that estranged integration we call sanity might be. True sanity entails in one way or another the dissolution of the normal ego, that false self competently adjusted to our alienated social reality; the emergence of the ‘inner’ archetypal mediators of divine power, and through this death a rebirth, and the eventual reestablishment of a new kind of ego-functioning, the ego now being the servant of the divine, no longer its betrayer.” 47

As with death per se, we choose on the whole to remain oblivious to our all-embracing deathly materialist existence, excepting a dwindling minority whom our secular society marginalises as deluded and misguided at best, and at worst as cranks or fanatics – and there are many religious cranks and fanatics, of course, just as there are no less fanatical anti-religious zealots. Perhaps, to paraphrase Philip Larkin, the rest of us really ought to be screaming. Whether stultified or petrified, inwardly many are, and that’s where the pills come in.

Laing did not mistake madness for normality, but understood perfectly well that normality can often be madness too. And normality, in turn, after being exposed as madness, has deliberately misunderstood Laing ever since.

Next chapter…

*

Addendum: Advertising vs. sanity

The following brief extract is drawn from an article by satirist Hugh Iglarsh based around an interview with activist and award-winning documentary filmmaker Jean Kilbourne that was published in Counterpunch magazine in October 2020.

HI: What kind of personality does advertising cultivate? How would you describe the ideal consumer or recipient of advertising?

JK: The ideal ad watcher or reader is someone who’s anxious and feels incomplete. Addicts are great consumers because they feel empty and want to believe that something out there, something for sale, can fill them up. Perhaps the ideal consumer is someone suffering from bulimia, because this person will binge and gorge and then purge, thus needing to start the cycle all over again.

HI: Addiction is one of the major themes of your book. How does advertising help foster addiction?

JK: The selling of addictive products is of course a big part of what advertisers do. They study addiction very closely, and they know how addicts think – they literally know what lights up their brains.

Advertisers understand that it is often disconnection in childhood that primes people for addiction. For many traumatized people, the first time they drink or smoke or take drugs may be the very first time they feel okay. Soon they feel they are in a relationship with the alcohol or the cigarette. Addicts aren’t stupid – the stuff they’re taking really does work, at least at first. It numbs the pain, which makes them feel connected to the substance. Eventually the drug or substance turns on them and makes all the problems they’re fleeing from worse.

What struck me about the genius of advertisers is how they exploit themes of tremendous importance to addicts, such as their fear of loneliness and desire for freedom. This is precisely what addiction does to you – it seems to offer you what you need, while actually making you more dependent, more alone. The ads promise freedom and connection, in the form of products that entrap users and weaken relationships. 48

In Chapter Eight, The Unreal Thing, I present my own thoughts on the detrimental impact of advertising on modern culture.

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.

*

1 Extract from The Divided Self: An Existential Study in Sanity and Madness by R. D. Laing, first published 1959/60; “Preface to the Pelican Edition” written September 1964.

2 From an article entitled “Asylum tourism” by Jennifer L. Bazar and Jeremy T. Burman, published in Monitor on Psychology, February 2014, Vol 45, No. 2. https://www.academia.edu/11707191/Asylum_tourism_In_the_19th_century_travelers_visited_asylums_to_admire_the_institutions_architecture_and_grounds

3 Sometimes quoted in Latin as Quos Deus vult perdere, prius dementat (literally: Those whom God wishes to destroy, he first deprives of reason) or Quem Iuppiter vult perdere, dementat prius (literally: Those whom Jupiter wishes to destroy, he first deprives of reason) this expression has been used in English literature since at least the 17th century. In the form presented here it first appeared in the Reverend William Anderson Scott’s book Daniel, a Model for Young Men and then later in Longfellow’s poem The Masque of Pandora. Although falsely attributed to Euripides, earlier versions of this phrase do indeed have classical Greek origins.

4 The shift in attitude towards sexual practices as extreme as sadomasochism is a curious one. I take the liberal view that it is right to be fully tolerant of activities that do not injure innocent parties, and so do not wish to infringe individual freedoms when they do not violate the freedom of others. Nevertheless, I tend to regard sexual practices such as sadomasochism as perverse, and not because I do not understand them, but because I do. I recognise the urge that twists pleasure and pain together; the same one that mixes up vulnerability with humiliation. The psychological dangers are abundantly clear to me, and the fact that our society today actively promotes and normalises S/M is perhaps indicative of a traumatic breakdown in human relations. It is wonderful that society has overcome so many of its hang-ups, but not all taboos are equal. Taboos against inflicting severe pain, even when consensual, do make sense.

Sarah Byrden, a sex educator and sacred sexuality teacher, says we are simultaneously (without realising it) “being bounced off the walls between pornography and Puritanism”:

5    A quote along these lines is certainly attributed to Salvador Dalí.

6

After calling the hospital for an appointment, the pseudopatient arrived at the admissions office complaining that he had been hearing voices. Asked what the voices said, he replied that they were often unclear, but as far as he could tell they said “empty,” “hollow,” and “thud.” The voices were unfamiliar and were of the same sex as the pseudopatient. The choice of these symptoms was occasioned by their apparent similarity to existential symptoms. Such symptoms are alleged to arise from painful concerns about the perceived meaninglessness of one’s life. It is as if the hallucinating person were saying, “My life is empty and hollow.” The choice of these symptoms was also determined by the absence of a single report of existential psychoses in the literature.

Beyond alleging the symptoms and falsifying name, vocation, and employment, no further alterations of person, history, or circumstances were made. The significant events of the pseudopatient’s life history were presented as they had actually occurred. Relationships with parents and siblings, with spouse and children, with people at work and in school, consistent with the aforementioned exceptions, were described as they were or had been. Frustrations and upsets were described along with joys and satisfactions. These facts are important to remember. If anything, they strongly biased the subsequent results in favor of detecting insanity, since none of their histories or current behaviors were seriously pathological in any way.

Immediately upon admission to the psychiatric ward, the pseudopatient ceased simulating any symptoms of abnormality. In some cases, there was a brief period of mild nervousness and anxiety, since none of the pseudopatients really believed that they would be admitted so easily. Indeed, their shared fear was that they would be immediately exposed as frauds and greatly embarrassed. Moreover, many of them had never visited a psychiatric ward; even those who had, nevertheless had some genuine fears about what might happen to them. Their nervousness, then, was quite appropriate to the novelty of the hospital setting, and it abated rapidly.

Apart from that short-lived nervousness, the pseudopatient behaved on the ward as he “normally” behaved. The pseudopatient spoke to patients and staff as he might ordinarily. Because there is uncommonly little to do on a psychiatric ward, he attempted to engage others in conversation. When asked by staff how he was feeling, he indicated that he was fine, that he no longer experienced symptoms. He responded to instructions from attendants, to calls for medication (which was not swallowed), and to dining-hall instructions. Beyond such activities as were available to him on the admissions ward, he spent his time writing down his observations about the ward, its patients, and the staff. Initially these notes were written “secretly,” but as it soon became clear that no one much cared, they were subsequently written on standard tablets of paper in such public places as the dayroom. No secret was made of these activities.

The pseudopatient, very much as a true psychiatric patient, entered a hospital with no foreknowledge of when he would be discharged. Each was told that he would have to get out by his own devices, essentially by convincing the staff that he was sane. The psychological stresses associated with hospitalization were considerable, and all but one of the pseudopatients desired to be discharged almost immediately after being admitted. They were, therefore, motivated not only to behave sanely, but to be paragons of cooperation. That their behavior was in no way disruptive is confirmed by nursing reports, which have been obtained on most of the patients. These reports uniformly indicate that the patients were “friendly,” “cooperative,” and “exhibited no abnormal indications.”

Extract taken from D. L. Rosenhan, “On being sane in insane places”, published in Science 179 (4070): 250–58, January 1973. http://web.archive.org/web/20041117175255/http://web.cocc.edu/lminorevans/on_being_sane_in_insane_places.htm

7    Ibid.

8    Ibid.

9

A psychiatric label has a life and an influence of its own. Once the impression has been formed that the patient is schizophrenic, the expectation is that he will continue to be schizophrenic. When a sufficient amount of time has passed, during which the patient has done nothing bizarre, he is considered to be in remission and available for discharge. But the label endures beyond discharge, with the unconfirmed expectation that he will behave as a schizophrenic again. Such labels, conferred by mental health professionals, are as influential on the patient as they are on his relatives and friends, and it should not surprise anyone that the diagnosis acts on all of them as a self-fulfilling prophecy. Eventually, the patient himself accepts the diagnosis, with all of its surplus meanings and expectations, and behaves accordingly. [Ibid.]

10  Ibid.

11 Physicists – at least all the ones I’ve known – whether they’ve heard it before or not (and they generally have heard it before), get the joke immediately; non-physicists, on the other hand, I refer to the old saw that “many a true word is spoken in jest.” For such blunt reductionism certainly does lie at the heart of physics, and indeed of all ‘hard science’ – disciplines founded upon the simplification of the infinitely complex processes of the natural world. With its especial penchant for ‘elegance’ and parsimony, physics trains every student through repeated worked examples, until they are hard-wired to consider the most straightforward and ideal case as the most productive first step in solving every problem: hence the spherical cow. The funny thing is how often it works!

Consider a Spherical Cow is the title of a 1988 book by environmental scientist John Harte about methods of problem solving using simplified models.

In a letter to the journal Science published in 1973, Steven D. Stellman instead postulated “A Spherical Chicken”. https://science.sciencemag.org/content/182/4119/1296.3

12 The fact that no-one is actually able to answer this question says a lot about time machines – but that’s for a separate discussion!

13    From the essay Night Walks written by Charles Dickens, originally published in his weekly journal All the Year Round in 1860, and appearing as Chapter 13 of The Uncommercial Traveller (1861).

14 From Aldous Huxley’s Brave New World Revisited (1958), chapter 8 “Chemical Persuasion”

15 From Oliver Sacks’ A Leg to Stand On (1984), chapter VII “Understanding”

16 From an interview in The Observer published January 25, 1931.

17 In 1951, Solomon Asch conducted his first conformity laboratory experiments, inviting groups of male college students to participate in a simple “perceptual” task: deciding which of three lines labelled A, B and C matched the length of a comparator line shown on a different card. In reality, all but one of the participants were actors, and the true focus of the study was how the remaining participant would react to the actors’ behaviour. Each participant was asked in turn to say aloud which line matched the comparator, with the seating arranged so that the real participant always responded last.

In the control group, with no pressure to conform, the error rate on the critical stimuli was less than 1%. Even in the actor condition, the majority of participants’ responses remained correct (63.2%), but a sizable minority conformed to the actors’ incorrect answer (36.8%). The responses revealed strong individual differences: 5% of participants were always swayed by the crowd, only 25% consistently defied majority opinion, and the rest conformed on some trials. Overall, 75% of participants gave at least one incorrect answer across the 12 critical trials. Reflecting on the results, Asch put it this way: “That intelligent, well-meaning, young people are willing to call white black is a matter of concern.”

18 This is sometimes called ‘Planck’s Principle’. It is taken from the following passages in Wissenschaftliche Selbstbiographie. Mit einem Bildnis und der von Max von Laue gehaltenen Traueransprache [trans: Scientific Autobiography. With a portrait and the memorial address delivered by Max von Laue], Johann Ambrosius Barth Verlag (Leipzig, 1948), p. 22; translated by F. Gaynor in Scientific Autobiography and Other Papers (1949), pp. 33–34, 97.

“A new scientific truth does not generally triumph by persuading its opponents and getting them to admit their errors, but rather by its opponents gradually dying out and giving way to a new generation that is raised on it. … An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out, and that the growing generation is familiarized with the ideas from the beginning: another instance of the fact that the future lies with the youth.”

19 William James, The Principles of Psychology, Volume I, Chapter VII, p. 196 (1890).

20    Transforming Diagnosis, a post by former National Institute of Mental Health (NIMH) Director Thomas Insel, published by NIMH on April 29, 2013. https://www.nimh.nih.gov/about/directors/thomas-insel/blog/2013/transforming-diagnosis.shtml

21  The film (released in 1975) was an adaptation of the novel of the same name written by Ken Kesey and published more than a decade earlier, in 1962. Kesey based his story on his experiences working late shifts as an orderly at a mental health institution, as well as on more personal experiences of using psychedelics.

22 Quote taken from Cracked: Why Psychiatry is Doing More Harm Than Good (2012) by James Davies, Chapter 2, “The DSM – a great work of fiction?”

23 Ibid.

24 Ibid.

25 Ibid.

26 Ibid.

27 Ibid.

28 Ibid.

29 From an article entitled “Diagnosing the D.S.M.” written by Allen Frances, published in The New York Times on May 11, 2012. http://www.nytimes.com/2012/05/12/opinion/break-up-the-psychiatric-monopoly.html?_r=1

30 From an article entitled “Inside The Battle To Define Mental Illness” written by Gary Greenberg, published in Wired magazine on December 27, 2010. https://www.wired.com/2010/12/ff_dsmv/

31 The practice continued in France into the 1980s, whereas, perhaps surprisingly, it had already been banned on moral grounds in the Soviet Union by 1950.

32

The Montreal Experiments, carried out on patients suffering from schizophrenia, used sensory deprivation, ECT and drugs (including drug-induced coma) combined with “psychic driving”, an early form of brainwashing involving pre-recorded audio tapes played non-stop for days, with up to half a million repetitions altogether. One of Cameron’s victims was Jean Steel, whose daughter Alison (only four and a half at the time of her mother’s treatment) told CBC News in an interview:

“She was never able to really function as a healthy human being because of what they did to her.”

From an article entitled “Federal government quietly compensates daughter of brainwashing experiments victim” written by Elizabeth Thompson, published by CBC News on October 26, 2017. https://www.cbc.ca/news/politics/cia-brainwashing-allanmemorial-mentalhealth-1.4373590

Embedded below is an episode from the CBC investigative documentary series The Fifth Estate entitled “MK Ultra: CIA mind control program in Canada”, first broadcast in 1980:

33 An article titled “Rorschach and Awe” written by Katherine Eban, published in Vanity Fair in July 2007, reported that:

A psychologist named Jean Maria Arrigo came to see me with a disturbing claim about the American Psychological Association, her profession’s 148,000-member trade group. Arrigo had sat on a specially convened A.P.A. task force that, in July 2005, had ruled that psychologists could assist in military interrogations, despite angry objections from many in the profession. […]

Two psychologists in particular played a central role: James Elmer Mitchell, who was attached to the C.I.A. team that eventually arrived in Thailand, and his colleague Bruce Jessen. Neither served on the task force or are A.P.A. members. Both worked in a classified military training program known as SERE—for Survival, Evasion, Resistance, Escape—which trains soldiers to endure captivity in enemy hands. Mitchell and Jessen reverse-engineered the tactics inflicted on SERE trainees for use on detainees in the global war on terror, according to psychologists and others with direct knowledge of their activities. The C.I.A. put them in charge of training interrogators in the brutal techniques, including “waterboarding,” at its network of “black sites.” In a statement, Mitchell and Jessen said, “We are proud of the work we have done for our country.”

https://www.vanityfair.com/news/2007/07/torture200707?printable=true&currentPage=all

An article titled “The Black Sites” written by Jane Mayer, published in The New Yorker in August 2007 picked up the same story:

The use of psychologists [on the SERE program] was also considered a way for C.I.A. officials to skirt measures such as the Convention Against Torture. The former adviser to the intelligence community said, “Clearly, some senior people felt they needed a theory to justify what they were doing. You can’t just say, ‘We want to do what Egypt’s doing.’ When the lawyers asked what their basis was, they could say, ‘We have Ph.D.s who have these theories.’ ” He said that, inside the C.I.A., where a number of scientists work, there was strong internal opposition to the new techniques. “Behavioral scientists said, ‘Don’t even think about this!’ They thought officers could be prosecuted.”

Nevertheless, the SERE experts’ theories were apparently put into practice with Zubaydah’s interrogation. Zubaydah told the Red Cross that he was not only waterboarded, as has been previously reported; he was also kept for a prolonged period in a cage, known as a “dog box,” which was so small that he could not stand. According to an eyewitness, one psychologist advising on the treatment of Zubaydah, James Mitchell, argued that he needed to be reduced to a state of “learned helplessness.” (Mitchell disputes this characterization.)

https://www.newyorker.com/magazine/2007/08/13/the-black-sites

A subsequent Senate Intelligence Committee report from 2014 confirms that:

The CIA used two outside contract psychologists to develop, operate, and assess its interrogation operations. The psychologists’ prior experience was at the Air Force Survival, Evasion, Resistance and Escape (SERE) school. […]

The contractors developed the list of enhanced interrogation techniques and personally conducted interrogations of some of the CIA’s most significant detainees using those techniques. The contractors also evaluated whether detainees’ psychological state allowed for the continued use of the techniques, even for some detainees they themselves were interrogating or had interrogated. […]

In 2005, the psychologists formed a company to expand their work with the CIA. Shortly thereafter, the CIA outsourced virtually all aspects of the program. The CIA paid the company more than $80 million.

https://www.feinstein.senate.gov/public/index.cfm/senate-intelligence-committee-study-on-cia-detention-and-interrogation-program

34

“The discovery of phenothiazines, the first family of antipsychotic agents has its origin in the development of the German dye industry, at the end of the 19th century (Graebe, Liebermann, Bernthsen). Up to 1940 they were employed as antiseptics, antihelminthics and antimalarials (Ehrlich, Schulemann, Gilman). Finally, in the context of research on antihistaminic substances in France after World War II (Bovet, Halpern, Ducrot) the chlorpromazine was synthesized at Rhône-Poulenc Laboratories (Charpentier, Courvoisier, Koetschet) in December 1950. Its introduction in anaesthesiology, in the antishock area (lytic cocktails) and “artificial hibernation” techniques, is reviewed (Laborit), and its further psychiatric clinical introduction in 1952.”

From the abstract to a paper entitled “History of the Discovery and Clinical Introduction of Chlorpromazine” authored by Francisco Lopez-Muñoz, Cecilio Alamo, Eduardo Cuenca, Winston W. Shen, Patrick Clervoy and Gabriel Rubio, published in the Annals of Clinical Psychiatry, 17(3):113–135, 2005. https://www.researchgate.net/publication/7340552_History_of_the_Discovery_and_Clinical_Introduction_of_Chlorpromazine

35 Psychiatry’s New Brain-Mind and the Legend of the “Chemical Imbalance”, written by Ronald W. Pies, Editor-in-Chief Emeritus, and published by Psychiatric Times on July 11, 2011. http://www.psychiatrictimes.com/couch-crisis/psychiatrys-new-brain-mind-and-legend-chemical-imbalance

36 Extract from The Divided Self: An Existential Study in Sanity and Madness by R. D. Laing, first published 1959/60; Part 1, Chapter 1, “The Existential-Phenomenological Foundations for A Science of Persons”.

37 McManus S, Bebbington P, Jenkins R, Brugha T. (eds.) (2016). Mental health and wellbeing in England: Adult psychiatric morbidity survey 2014.

38 Larkin’s celebrated poem This Be the Verse, which begins with the lines “They fuck you up, your mum and dad/ They may not mean to, but they do”, was written and first published in 1971.

39 One of Laing’s great interests was the “double bind” situation, which he came to diagnose as the root cause of most of the madness around him. Laing had adopted the idea of the “double bind” from the anthropologist Gregory Bateson. Bateson, in turn, had traced the notion back to a semi-autobiographical novel by the Victorian writer Samuel Butler, entitled The Way of All Flesh. But Butler had only described the condition and not named it, whereas Bateson had rediscovered it and labelled it as an important cause of schizophrenia.

Hearing “I love you” from a parent, for instance, whilst seeing no expression that supported the evidence of that professed love presented the patient with a “double bind”. This is just one example, but Laing had witnessed this and many other kinds of “paradoxical communication” in his patients’ relationships with their nearest and dearest. He eventually came to believe, along with Bateson, that being caught in such a “double bind” was existentially damaging and very commonly, therefore, psychologically crippling. In recognising this, Laing had undoubtedly discovered a fragment of the truth, and it is a shame that he then over-intellectualises the issue, as intellectuals are wont to do. Replace “double bind” with “mind game” and his case becomes much clearer. If people, especially those you are closest to and those you need to trust, constantly undermine your view of yourself and of your relationship to others, then the seeds of destruction are being sown. But to my mind, such details of Laing’s outlook are nothing like as interesting and illuminating as the general thrust of what he had to say about our society.

40 As quoted in Wisdom for the Soul: Five Millennia of Prescriptions for Spiritual Healing (2006) by Larry Chang, p. 412; this might be a paraphrase, as the earliest occurrence of this phrase thus far located is in the form: “Ronald David Laing has shocked many people when he suggested in 1972 that insanity can be a perfectly rational adjustment to an insane world.” in Studii de literatură română și comparată (1984), by The Faculty of Philology-History at Universitatea din Timișoara. A clear citation to Laing’s own work has not yet been found.

41 From the first of T.S. Eliot’s Four Quartets titled Burnt Norton.

42 This passage continues:

“If you protest, my friend, wait till you arrive there yourself! To believe in the carnivorous reptiles of geologic times is hard for our imagination—they seem too much like mere museum specimens. Yet there is no tooth in any one of those museum-skulls that did not daily through long years of the foretime hold fast to the body struggling in despair of some fated living victim. Forms of horror just as dreadful to their victims, if on a smaller spatial scale, fill the world about us to-day. Here on our very hearths and in our gardens the infernal cat plays with the panting mouse, or holds the hot bird fluttering in her jaws. Crocodiles and rattlesnakes and pythons are at this moment vessels of life as real as we are; their loathsome existence fills every minute of every day that drags its length along; and whenever they or other wild beasts clutch their living prey, the deadly horror which an agitated melancholiac feels is the literally right reaction on the situation.”

Extract taken from The Varieties of Religious Experience: A Study in Human Nature, Lectures VI and VII, “The Sick Soul”, William James (1902)

43 In their 2009 book The Spirit Level: Why More Equal Societies Almost Always Do Better, authors Richard G. Wilkinson and Kate Pickett examined the major impact that inequality has on eleven different health and social problems: physical health, mental health, drug abuse, education, imprisonment, obesity, social mobility, trust and community life, violence, teenage pregnancies, and child well-being. The related Equality Trust website, co-founded by the authors, also includes scatter plots from their book. One of these shows a remarkably close correlation between the prevalence of mental illness and income inequality, with the following explanatory notes attached:

“Until recently it was hard to compare levels of mental illness between different countries because nobody had collected strictly comparable data, but recently the World Health Organisation has established world mental health surveys that are starting to provide data. They show that different societies have very different levels of mental illness. In some countries only 5 or 10% of the adult population has suffered from any mental illness in the past year, but in the USA more than 25% have.

“We first showed a relationship between mental illness and income inequality in eight developed countries with WHO data – the USA, France, Netherlands, Belgium, Spain, Germany, Italy, and Japan. Since then we’ve been able to add data for New Zealand and for some other countries whose surveys of mental illness, although not strictly comparable, use very similar methods – Australia, the UK and Canada. As the graph [above] shows, mental illness is much more common in more unequal countries. Among these countries, mental illness is also more common in the richer ones.”

More Information

Pickett KE, James OW, Wilkinson RG. Income inequality and the prevalence of mental illness: a preliminary international analysis. Journal of Epidemiology and Community Health 2006;60(7):646-7.

Wilkinson RG, Pickett KE. The problems of relative deprivation: why some societies do better than others. Social Science and Medicine 2007; 65: 1965-78.

James O. Affluenza, London: Vermilion, 2007.

Friedli L. Mental health, resilience and inequalities: how individuals and communities are affected, World Health Organisation. 2009.

Wilkinson RG, Pickett KE. The Spirit Level. Penguin. 2009.

The notes and graph are also available by following the link: https://www.equalitytrust.org.uk/mental-health

44 A distinction I owe to the American archetypal psychologist James Hillman, former Director of Studies at the C.G. Jung Institute in Zurich.

45 The facts speak for themselves really. For instance, a 2011 report from the Centers for Disease Control and Prevention (CDC) reveals that in just ten years antidepressant use in the US increased by a staggering 400%.

http://www.cbsnews.com/8301-504763_162-20123062-10391704.html

The report reveals that more than one in ten Americans aged 12 or over is taking antidepressants. But that’s okay, according to the authors of the report, because: “… many people who could benefit from antidepressants aren’t taking them. Only a third of people with symptoms of severe depression take antidepressants.”

The same report also reveals that a further 8% of Americans without depressive symptoms take the drugs for other reasons, such as anxiety. And what about the population below 12 years old? Well, the following is taken from a report on what’s happening closer to home, published by the Guardian in March 2011, which begins:

“Children as young as four are being given Ritalin-style medication for behavioural problems in breach of NHS guidelines.”

http://www.guardian.co.uk/society/2011/mar/18/behaviour-drugs-four-year-olds

According to official UK guidelines, children over the age of six can now be prescribed mind-altering substances, even when these are to be administered on a daily basis.

46 Extract from The Divided Self: An Existential Study in Sanity and Madness by R. D. Laing, first published 1959/60; “Preface to the Pelican Edition” written September 1964. Laing adds: “But let it stand. This was the work of a young man. If I am older, I am now also younger.”

47 R. D. Laing, The Politics of Experience (Ballantine Books, N.Y., 1967)

48 From an article entitled “Advertising vs. Democracy: An Interview with Jean Kilbourne” written by Hugh Iglarsh, published in Counterpunch magazine on October 23, 2020. https://www.counterpunch.org/2020/10/23/advertising-vs-democracy-an-interview-with-jean-kilbourne/
