
living the dream

The following article is an Interlude between Parts I and II of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from the category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.

*

“Once upon a time, I, Chuang Chou, dreamt I was a butterfly, fluttering hither and thither, to all intents and purposes a butterfly. I was conscious only of my happiness as a butterfly, unaware that I was Chou. Soon I awaked, and there I was, veritably myself again. Now I do not know whether I was then a man dreaming I was a butterfly, or whether I am now a butterfly, dreaming I am a man.”

— Chuang Tzu 1

*

Before proceeding further, I’d like to tell a joke:

A man walks into a doctor’s.

“Doctor, Doctor, I keep thinking I’m a moth,” the man says.

The doctor gives him a serious look. “Sorry, but I am not strictly qualified to help you,” he replies, rubbing his chin earnestly before adding after a momentary pause, “You really need to see a psychiatrist.”

“Yes,” says the man, “but your light was on.”

*

There can be no doubting that each of us acts to a considerable extent in accordance with mental processes that are distantly beyond and often alien to our immediate conscious awareness and understanding. For instance, in general we draw breath without the least consideration, or raise an arm, perhaps to scratch ourselves, with scarcely a thought and zero comprehension of how we actually moved our hand and fingers to accomplish the act. And this everyday fact becomes more startling once we consider how even complex movements and sophisticated patterns of behaviour seem to originate without full conscious direction or awareness.

Consider walking for instance. After admittedly painstaking practice as infants, we soon become able to walk without ever thinking to swing our legs. Likewise, if we have learnt to drive, eventually we are able to manoeuvre a large vehicle with hardly more conscious effort than we apply to walking. The same is true for most daily tasks, which are performed no less thoughtlessly and which, in spite of their intricacies, we often find boring and mundane. For instance, those who have been smokers may be able to perform the rather complicated art of rolling a cigarette without pausing from conversation. Indeed, deep contemplation will probably leave us more bewildered than anything by the mysterious coordinated manipulation of all eight fingers and opposing thumbs.

Stranger still is that our ordinary conversational speech proceeds before we have formed the fully conscious intent to utter our actual words! When I first heard this claim, it struck me as so unsettling that I automatically rejected it outright in what ought perhaps to be called a tongue-jerk reaction. (Not long afterwards I was drunk enough to stop worrying about the latent implications!) For considered dispassionately, it is self-evident that there isn’t remotely sufficient time to construct each and every utterance consciously and in advance of the act of speaking; so our vocal ejaculations (as they once were unashamedly called) are just that – they are thrown out! Still further proof is provided by instances when gestures or words emerge in direct conflict to our expressed beliefs and ideas. Those embarrassing occasions when we blurt out what we know must never be spoken we call Freudian slips (and more on Freud below).

More positively, and especially when we enter ‘the zone’, each of us is able to accomplish complex physical acts – for instance throwing, catching, or kicking a ball – and again before any conscious thought arises to do so. Those who have played a sport long enough can probably recall many joyous moments when they have marvelled not only at their own impossible spontaneity, but the accompanying accuracy, deftness, nimbleness, and on very rare occasions even of enhanced physical strength. Likewise, urges, feelings, fears and sometimes the most profound insights will suddenly spring forth into “the back of our minds”, as if from nowhere. And as a consequence, this apparent nowhere acquired a name: coming to be known as “the preconscious”, “the subconscious” and more latterly, “the unconscious”.

What this means, of course, is that “I” am not what I ordinarily think I am, but in actuality a lesser aspect of a greater being who enjoys remarkable talents and abilities beyond what are ordinarily thought “my own” since they lie outside “my” immediate grasp. In this way, we all have hidden depths that can and do give rise to astonishment, although for peculiar reasons of pride, we tend in general to feign ignorance of this everyday fact.

*

The person most popularly associated with the study of the human unconscious is Sigmund Freud, a pioneer in the field but by no means a discoverer. In fact the philosopher and all-round genius Gottfried Leibniz has a prior claim to the discovery, having suggested that our conscious awareness may be influenced by “insensible stimuli” that he called petites perceptions 1. Another giant of German philosophy, Immanuel Kant, also subsequently proposed the existence of ideas lurking within us of which we are not fully aware, while admitting the apparent contradiction inherent in such a conjecture:

“To have ideas, and yet not be conscious of them, — there seems to be a contradiction in that; for how can we know that we have them, if we are not conscious of them? Nevertheless, we may become aware indirectly that we have an idea, although we be not directly cognizant of the same.” 2

Nor was Freud the first to attempt any kind of formal analysis of the make-up and workings of the human psyche as an entity. Already in 1890, William James had published his own ground-breaking work Principles of Psychology, and though James was keen to explore and outline his principles for human psychology by “the description and explanation of states of consciousness”, rather than to plunge more deeply into the unknown, he remained fully aware of the potentiality of unconscious forces and made clear that any “‘explanation’ [of consciousness] must of course include the study of their causes, conditions and immediate consequences, so far as these can be ascertained.” 3

*

William James’ own story is both interesting and instructive. As a young man he had been at somewhat of a loss to decide what to do with himself. Having briefly trained as an artist, he quickly realised that he’d never be good enough and became disillusioned with the idea, declaring that “there is nothing on earth more deplorable than a bad artist”. He afterwards retrained in chemistry, enrolling at Harvard in 1861 (a few months after the outbreak of the American Civil War), but restless again, twelve months or so later, transferred to biology. Still only twenty-one, James soon felt that he was running out of options, writing in a letter to his cousin:

“I have four alternatives: Natural History, Medicine, Printing, Beggary. Much may be said in favour of each. I have named them in the ascending order of their pecuniary invitingness. After all, the great problem of life seems to be how to keep body and soul together, and I have to consider lucre. To study natural science, I know I should like, but the prospect of supporting a family on $600 a year is not one of those rosy dreams of the future with which the young are said to be haunted. Medicine would pay, and I should still be dealing with subjects which interest me – but how much drudgery and of what an unpleasant kind is there!”

Three years on, James entered the Harvard Medical School, where he quickly became disillusioned. Certain that he no longer wished to become a practicing doctor, and being more interested in psychology and natural history than medicine, he seized a fresh opportunity when it arose and soon set sail for the Amazon in hopes of becoming a naturalist. However, the expedition didn’t work out well either. Fed up with collecting bugs and bored with the company of his fellow explorers, to cap everything, he fell quite ill. Although desperate to return home, he was obliged to continue, and slowly he regained his strength, deciding that in spite of everything it had been a worthwhile diversion; no doubt heartened too by the prospect of finally returning home.

It was 1866 when James next resumed medical studies at Harvard, although the Amazon adventure had left him physically and (very probably) psychologically weakened – a continuing sickness that forced him to break off from his studies yet again. Seeking rest and recuperation, for the next two years James sojourned in Europe, where, to judge from his own accounts, he again experienced a great deal of isolation, loneliness and boredom. Returning to America at the end of 1868 – now approaching twenty-seven years old – he picked up his studies at Harvard for the last time, successfully passing his degree to become William James M.D. in 1869.

Too weak to find work anyway, James had stayed resolute in his unwillingness to become a practicing doctor. So for a prolonged period, he did nothing at all, or next to nothing. Three years passed during which, besides the occasional publication of articles and reviews, he devoted himself solely to reading books and thinking thoughts, often quite gloomy ones. Then suddenly, one day, he had a semi-miraculous revelation: a very dark revelation that made him exceedingly aware not only of his own mental fragility, but of the likely prognosis:

“Whilst in this state of philosophic pessimism and general depression of spirits about my prospects, I went one evening into the dressing room in the twilight… when suddenly there fell upon me without any warning, just as if it came out of the darkness, a horrible fear of my own existence. Simultaneously there arose in my mind the image of an epileptic patient whom I had seen in the asylum, a black-haired youth with greenish skin, entirely idiotic, who used to sit all day on one of the benches, or rather shelves, against the wall, with his knees drawn up against his chin, and the coarse gray undershirt, which was his only garment, drawn over them, inclosing his entire figure. He sat there like a sort of sculptured Egyptian cat or Peruvian mummy, moving nothing but his black eyes and looking absolutely non-human. This image and my fear entered into a species of combination with each other. That shape am I, I felt, potentially. Nothing that I possess can defend me against that fate, if the hour for it should strike for me as it struck for him. There was such a horror of him, and such a perception of my own merely momentary discrepancy from him, that it was as if something hitherto solid within my breast gave way entirely, and I became a mass of quivering fear. After this the universe was changed for me altogether. I awoke morning after morning with a horrible dread at the pit of my stomach, and with a sense of the insecurity of life that I never knew before, and that I have never felt since. It was like a revelation; and although the immediate feelings passed away, the experience has made me sympathetic with the morbid feelings of others ever since.” 4

Having suffered what today would very likely be called ‘a nervous breakdown’, James was forced to reflect on the current theories of the mind. Previously, he had accepted the materialist ‘automaton theory’ – that our ability to act upon the world depends not upon conscious states as such, but upon the brain-states that underpin and produce them – but now he felt that if true this meant he was personally trapped forever in a depression that could only be cured by the administering of some kind of physical remedy. However, no such remedy was obtainable, and so he was forced instead to tackle his disorder by means of further introspection and self-analysis.

James read more and thought more since there was nothing else he could do. Three more desperately unhappy years would pass before he had sufficiently recuperated to rejoin the ordinary world, accepting an offer to become lecturer in physiology at Harvard. But as luck would have it, teaching suited James. He enjoyed the subject of physiology itself, and found the activity of teaching “very interesting and stimulating”. James had, for once, landed on his feet, and his fortunes were also beginning to improve in other ways.

Enjoying the benefits of a steady income for the first time in his life, he was soon to meet Alice Gibbons, the future “Mrs W.J.” They married two years later in 1878. She was a perfect companion – intelligent, perceptive, encouraging, and perhaps most importantly for James, an organising force in his life. He had also just been offered a publishing contract to write a book on his main specialism, which was by now – and in spite of such diversity of training – most definitely psychology. With everything now in place, James set to work on what would be his magnum opus. Wasting absolutely no time whatsoever, he drafted the opening chapters while the couple were still on their honeymoon.

“What is this mythological and poetical talk about psychology and Psyche and keeping back a manuscript composed during honeymoon?” he wrote in jest to the taunts of a friend, “The only psyche now recognized by science is a decapitated frog whose writhings express deeper truths than your weak-minded poets ever dreamed. She (not Psyche but the bride) loves all these doctrines which are quite novel to her mind, hitherto accustomed to all sorts of mysticisms and superstitions. She swears entirely by reflex action now, and believes in universal Nothwendigkeit. [determinism]” 5

It would take James more than a decade to complete what quickly became the definitive university textbook on the subject, ample time for such ingrained materialist leanings to have softened. For the most part sticking to what was directly and consciously known to him, his attempts to dissect the psyche involved much painstaking introspection of what he famously came to describe as his (and our) “stream of consciousness”. Such close analysis of the subjective experience of consciousness itself had suggested to James the need to distinguish between “the Me and the I” as separate component parts of what in completeness he called “the self”. 6 In one way or another, this division of self into selves, whether these be consciously apprehensible or not, has remained a theoretical basis of all later methods of psychoanalysis.

There is a joke that Henry James was a philosopher who wrote novels, whereas his brother William was a novelist who wrote philosophy. But this does WJ a disservice. James’ philosophy, known as pragmatism, is a later diversion. Unlike his writings about psychology, which became the standard academic texts, as well as popular best-sellers (and what better tribute to James’s fluid prose); his ideas on pragmatism were rather poorly received (they have gained more favour over time). But then James was a lesser expert in philosophy, a situation not helped by his distaste for logical reasoning; and he would be better remembered for his writings on psychology, a subject in which he excelled. Freud’s claim to originality is nothing like as foundational.

James was at the vanguard during the period when psychology irreparably pulled apart from the grip philosophy had held on it (which explains why James was notionally Professor of Philosophy at the time he was writing), and was grafted back to form a subdiscipline of biology. For this reason, and regardless of the fact that James remained highly critical of the developing field of experimental psychology – as he was too of the deductive reasoners on both sides of the English Channel: the British Empiricists Locke and Hume, and the continental giants Leibniz, Kant and Hegel – to some of his contemporaries, James’ view appeared all too dangerously materialistic. If only they could have seen how areas of psychology were to so ruinously develop, they would have appreciated that James was, as always, a moderate.

*

While James remained an academic throughout his whole life, Freud – who had briefly studied zoology at the University of Vienna, spending one month unsuccessfully searching for the gonads of the male eel 7, followed by another spell doing neurology – decided to return to medicine and open his own practice. He had also received expert training in the new-fangled techniques of hypnosis.

‘Hypnosis’ comes from the Greek hupnos and means, in effect, “artificial sleep”. To induce hypnosis, the patient’s conscious mind needs to be distracted briefly, and achieving this opens up regions of the mind beyond the usual conscious states. The terms “sub-conscious” and “unconscious” were already in circulation prior to the theories of Freud or James. And whether named or not, mysterious evidence of the unconscious had always been known. Dreams, after all, though we consciously experience them, are neither consciously conceived nor willed. They just pop out from nowhere – or from “the unconscious”.

From his clinical experiences, Freud soon discovered what he believed to be better routes to the unconscious than hypnosis. For instance, he found that it was just as effective to listen to his patients, or if their conscious mind was unwilling to give up some of its defences – as it commonly was – then to encourage their free association of words and ideas. He also looked for unconscious connections within his patients’ dreams, gradually uncovering what he came to believe were the deeply repressed animalistic drives that govern the patient’s fears, attitudes and behaviour. Once the unconscious root of their problems had been found, the patient could finally begin to grapple with these repressed issues at an increasingly conscious level. It was a technique that apparently worked, with many of Freud’s patients recovering from the worst effects of their neuroses and hysteria, and so “the talking cure” became a lasting part of Freud’s legacy. You lay on the couch, and just out of sight, Freud listened and interpreted.

But Freud also left a bigger mark, by helping to shape the way we see ourselves. The types of unconscious repression he discovered in his own patients, he believed were universally present, and through drawing directly on his experiences as doctor, he slowly excavated, as he found it, the entire human unconscious piece by piece. Two of these aspects he labelled the ‘id’ and the ‘superego’: the one a seat of primal desires, the other a chastising moral guide – reminiscent of the squabbling devil-angel duo that pops up in cartoons, jostling for attention on opposite shoulders of the character whenever he’s plunged into a moral quandary. 8

In a reboot of philosopher Arthur Schopenhauer’s concept of blind and insatiable ‘will’, Freud proposed the existence of the libido: a primary, sexual drive that ceaselessly operates beneath our conscious awareness, prompting desires for pleasure and avoidance of pain irrespective of consequence and regardless of whether these desires conflict with ordinary social conventions. In concert with all of this, Freud discerned a natural process of psychological development 9 and came to believe that whenever this development is arrested or, more generally, whenever normal appetites are consciously repressed, then lurking deep within the unconscious, such repressed but instinctual desires will inevitably and automatically resurface in more morbid forms. This, he determined, was the common root cause of all his patients’ various symptoms and illnesses.

Had Freud stopped there, his contribution to psychology would have been fully commendable, for there is tremendous insight in these ideas. He says too much no doubt (especially when it comes to the specifics of human development), but he also says something that needed to be said very urgently: that if you force people to behave against their natures you will make them sick. So it seems a pity that Freud carried some of the ideas a little too far.

Let’s take the ‘Oedipus complex’, which, of the many Freudian features of our supposed psychological nether regions, is without doubt the one of greatest notoriety. The myth of Oedipus is enthralling; the eponymous hero compelled to deal with fate, misfortune and prophecy. 10 Freud finds in this tale a revelation of deep and universal unconscious repression, and though plausible and intriguing, his interpretation basically narrows its far grander scope:

“[Oedipus’s] destiny moves us only because it might have been ours – because the Oracle laid the same curse upon us before our birth as upon him. It is the fate of all of us, perhaps, to direct our first sexual impulse towards our mother and our first hatred and our first murderous wish against our father. Our dreams convince us that this is so.”11

Freud generally studied those with minor psychological problems (and did not deal with cases of psychosis), determining on the basis of an unhappy few what he presumed true for healthier individuals too, and this is perhaps a failure of all psychoanalytic theories. For though it may seem odd that he came to believe in the universality of the Oedipus Complex, who can doubt that his clients suffered from something like it? Who can doubt that Freud himself harboured the same dark desires? Perhaps he also felt a ‘castration anxiety’ as a result of the Oedipal rivalry he’d had with his own father. Maybe he actually experienced ‘penis envy’, if not of the same intensity as he said he detected in his female patients, then of a compensatory masculine kind! After all, such unconscious ‘transference’ of attitudes and feelings from one person to another – from patient onto the doctor, or vice versa in this relevant example – is another concept that Freud was first to identify and label.

*

Given the strait-laced age in which Freud had fleshed out his ideas, the swiftness with which these theories received widespread acceptance and acclaim seems surprising, although there are surely two good reasons why Freudianism took hold. The first is straightforward: that society had been very badly in need of a dose of Freud, or something very like Freud. After such excessive prudishness, the pendulum was bound to swing the other way. But arguably the more important reason – indeed the reason his theories have remained influential – is that Freud picked up the baton directly from where Darwin left off. By restricting his explanations to biological instincts and drives, Freud gave Freudianism the mantle of scientific legitimacy, a vital determining factor that helped to secure its prominent position within the modern epistemological canon.

Following his precedent, students of Freud, most notably Carl Jung and Alfred Adler, also drew on clinical experiences with their own patients, but gradually came to the conclusion, for different reasons, that Freud’s approach was too reductionist, and that there is considerably more to a patient’s mental well-being than healthy appetites and desires, and thus more to the psychological underworld than solely matters of sex and death.

Where Freud was a materialist and an atheist, Jung went on to incorporate aspects of the spiritual into his extended theory of the unconscious, though he remained respectful of biology and keen to anchor his own theories upon an evolutionary bedrock. Jung nevertheless speculates within a philosophical tradition that owes much to Immanuel Kant, while also drawing heavily on personal experience, and comes to posit the existence of psychical structures he calls ‘archetypes’ operating again at the deepest levels within a collective unconscious: a shared characteristic due to our common ancestry.

Thus he envisions ‘the ego’ – the aspect of our psyche we identify as “I” – as existing in relation to an unknown and finally unknowable sea inhabited by autonomous entities which have their own life. Jung actually suggests that Freud’s Oedipus complex is just one of these archetypes, while he finds himself drawn by the bigger fish of the unconscious beginning with ‘The Shadow’ – what is hidden and rejected by the ego – and what he determines are the communicating figures of ‘Animus/Anima’ (or simply ‘The Syzygy’) – a compensatory masculine/feminine unconscious presence within, respectively, the female and male psyche – that prepare us for incremental and never-ending revelations of our all-encompassing ‘Self’.

This lifelong psychical development, or ‘individuation’, was seen by Jung as an inherently religious quest and he is unapologetic in proclaiming so; the religious impulse being a product too of human evolutionary development along with opposable thumbs and upright posture. More than a mere vestigial hangover, religion is, Jung says, fundamental to the deep nature of our species.

Unlike Freud, Jung was also invested in understanding how the human psyche varies greatly from person to person, and to these ends introduced new ideas about character types, adding ‘introvert’ and ‘extrovert’ to the psychological lexicon to draw a division between individuals characterised either by primarily subjective or objective orientations to life – an introvert himself, Jung was able to observe such a clear distinction. Meanwhile, greatly influenced by Friedrich Nietzsche’s “will to power”, Adler switched attention to issues of social identity and specifically to why people felt – in very many cases quite irrationally – inferior or superior amongst their peers. These efforts culminated in the development of his theory of the ‘inferiority complex’ – which might also be thought of as an aspect of the Jungian ‘Shadow’.

These different schools of psychoanalysis are not irreconcilable. They are indeed rather complementary in many ways: Freud tackling the animal craving and want of pleasure; Jung looking for expression above and beyond what William Blake once referred to as “this vegetable world”; and Adler delving most directly into the mud of human relations, the pervasive urge to dominate and/or be submissive, and the consequences of personal trauma associated with interpersonal and societal inequalities.

Freud presumes that since we are biological products of Darwinian evolution, then our minds have been evolutionarily pre-programmed. Turning the same inquiry outward, Jung goes in search of common symbolic threads within mythological and folkloric traditions, enlisting these as evidence for the psychological archetypes buried deep within us all. And though Jung held no orthodox religious views of his own, he felt comfortable drawing upon religious (including overtly Christian) symbolism. In one of his most contemplative passages, he wrote:

Perhaps this sounds very simple, but simple things are always the most difficult. In actual life it requires the greatest art to be simple, and so acceptance of oneself is the essence of the moral problem and the acid test of one’s whole outlook on life. That I feed the beggar, that I forgive an insult, that I love my enemy in the name of Christ—all these are undoubtedly great virtues. What I do unto the least of my brethren, that I do unto Christ.

But what if I should discover that the least amongst them all, the poorest of all beggars, the most impudent of all offenders, yea the very fiend himself—that these are within me, and that I myself stand in need of the alms of my own kindness, that I myself am the enemy who must be loved—what then? Then, as a rule, the whole truth of Christianity is reversed: there is then no more talk of love and long-suffering; we say to the brother within us “Raca,” and condemn and rage against ourselves. We hide him from the world, we deny ever having met this least among the lowly in ourselves, and had it been God himself who drew near to us in this despicable form, we should have denied him a thousand times before a single cock had crowed. 12

Of course, “the very fiend himself” is the Jungian ‘Shadow’, the contents of which without recognition and acceptance then inevitably remain repressed, causing these unapproachable and rejected aspects of our own psyche to be projected out on to the world. ‘Shadow projection’ onto others fills the world with enemies of our own imagining; and this, Jung believed, was the root of nearly all evil. Alternatively, by taking Jung’s advice and accepting “that I myself am the enemy who must be loved”, we come back to ourselves in wholeness. It is only then that the omnipresent threat of the Other diminishes, as the veil of illusion forever separating the ego and reality is thinned. And Jung’s psychological reunification also grants access to previously concealed strengths (the parts of the unconscious discussed at the top), further enabling us to reach our fullest potential. 13

Today there are millions doing “shadow work” as it is now popularly known: self-help exercises often combined with traditional practices of yoga, meditation or the ritual use of entheogens: so here is a new meeting place – a modern mash-up – of religion and psychotherapy. Quietly and individually, a shapeless movement has arisen almost spontaneously as a reaction to the peculiar rigours of western civilisation. Will it change the world? For better or worse, it already has.

Alan Watts, who is best known for his Western interpretations of Eastern spiritual traditions and in particular Zen Buddhism and Daoism, here reads this same influential passage from one of Jung’s lectures in which he speaks of ending “the inner civil war”:

*

Now what about my joke at the top? What’s that all about? Indeed, and in all seriousness, what makes it a joke at all? Well, not wishing to delve deeply into theories of comedy, there is one structure that arises repeatedly and nearly universally: that the punch line to every joke relies on some kind of unexpected twist on the set up.

To illustrate the point, let’s turn to the most hackneyed joke of all: “Why did the chicken cross the road?” Here we find an inherent ambiguity that lies within use of the word ‘why’ and this is what sets up the twist. However, in the case of the joke about the psychiatrist and the man who thinks he’s a moth, the site of ambiguity isn’t so obvious. But here the humour I think comes down to alternative and finally conflicting notions of ‘belief’.

A brief digression then: What is belief? To offer a salient example, when someone tells you “I believe in God”, what are they intending to communicate? No less importantly, what would you take them to mean? Conversely, atheists will very often say “I don’t believe in anything” – so again, what are they (literally) trying to convey here? And what would a listener take them to mean? In all these instances the same word is used to describe similar but distinct attitudinal relationships to reality, and it is all too easy to presume that everyone is using the word in precisely the same way. But first, we must acknowledge that the word ‘belief’ actually carries two quite distinct meanings.

According to the first definition, it is “a mental conviction of the truth of an idea or some aspect of reality”. Belief in UFOs fits this criterion, as does a belief in gravity and that the sun will rise again tomorrow. How about belief in God? When late in life Jung was asked if he believed in God, he replied straightforwardly “I know”. 14 Others reply with the same degree of conviction if asked about angels, fairies, spirit guides, ghosts or the power of healing and crystals. As a physicist, I believe in the existence of atoms, electrons and quarks – although I’ve never “seen one”, like Jung I know!

So belief in this sense is more often than not grounded in a person’s direct experience(s), which obviously does not in itself validate the objective truth of the belief. He saw a ghost. She was healed by the touch of a holy man. We ran experiments to measure the charge on an electron. Again, in this sense I have never personally known of anyone who did not believe in the physical reality of a world of solid objects – for who doesn’t believe in the existence of tables and chairs? In this important sense everyone has many convictions about the truth of reality, and we surely all believe in something – this applies even in the case of the most hardline of atheists!

But there is also a second kind of belief: “of an idea that is believed to be true or valid without positive knowledge.” The emphasis here is on the lack of knowledge or indeed of direct experience. So this belief involves an effort of willing on the part of the believer. In many ways, this is to believe in make-believe, or we might just say “to make-believe”; to pretend or wish that something is real: the suspension of disbelief. I believe in unicorns…

As a child, I found all religion utterly mystifying, since what was self-evidently make-believe – for instance a “holy ghost” and the virgin birth! – would, for reasons I was unable to fathom, be held by others as sacrosanct. Based on my casual encounters with Christians, it also seemed evident that the harder you tried to make-believe in this maddening mystification of being, the better a person it made you! So here’s the point: when someone tells you they believe in God, is this all they actually mean? That they are trying with tremendous exertion, although little conviction, to make-believe in impossibility?

Indeed, is this striving alone mistaken not only as virtuous but as actual believing in the first sense? Yes, quite possibly – and not only for religious types. Alternatively, it may be that someone truly believes in God – or whatever synonym they choose to approximate to ‘cosmic higher consciousness’ – with the same conviction that all physicists believe in gravity and atoms. They may come to know ‘God’, as Jung did.

Now back to the joke and apologies for killing it: The man complains that he feels like a moth and this is so silly that we automatically presume his condition is entirely one of make-believe. But then the twist, when we learn that his actions correspond to his belief, which means, of course, he has true belief of the first kind. Finally, here’s my hunch then for why we find this funny: it spontaneously reminds us of how true beliefs – rather than make-believe – both inform reality as we perceive it, and fundamentally direct our behaviour. Yet we are always in the process of forgetting altogether that this is how we live too, until abruptly the joke reminds us again – and in our moment of recollecting, spontaneously we laugh.

Which also raises a question: To what extent do beliefs of the second ‘make-believe’ kind determine our behaviour too? Especially when the twin definitions show just how easy it can be to get confused over beliefs. Because as Kurt Vonnegut wrote in the introduction to his cautionary novel Mother Night: “This is the only story of mine whose moral I know”, continuing: “We are what we pretend to be, so we must be careful about what we pretend to be.” 15

*

I would like to return now to an idea I earlier disparaged, Dawkins’s concept ‘memes’: ideas, stories, and other cultural fragments, the development and transmission of which can be considered similar to the mutation and survival of genes. In evoking this concept of memes, Dawkins had hoped to wrest human behaviour apart from the rest of biology in order to present an account of how it came to be that our species alone is capable of surpassing the hardwired instructions encoded in our genes. For Dawkins this entailed some fleeting speculation upon the origins of human culture set out in the final pages of his popular science book, The Selfish Gene. Others later picked up on his idea and have reworked it into a pseudo-scientific discipline known as memetics; something I have already criticised.

In fact, the notion of some kind of evolutionary force actively driving human culture occurred to authors before Dawkins. In The Human Situation, for example, Aldous Huxley outlined his own thoughts on the matter, while already making the significant point that such kinds of “social heredity” must be along Lamarckian rather than Darwinian lines:

“While it is clear that the Lamarckian conception of the inheritance of acquired characteristics is completely unacceptable, and untrue biologically, it is perfectly true on the social, psychological and linguistic level: language does provide us means for taking advantage of the fruits of past experience. There is such a thing as social heredity. The acquisitions of our ancestors are handed down to us through written and spoken language, and we do therefore enjoy the possibility of inheriting acquired characteristics, not through germ plasm but through tradition.”

Like Dawkins, Huxley recognised that culture was the singular feature distinguishing our species from others. Culture on top of nature, dictated by education, religious upbringing, class status, and so forth, establishes the social paradigms according to which individuals in general behave. However, in Huxley’s version, as in Dawkins’, this is only metaphorically an evolutionary process, while both evidently regard the process of cultural development as most similar to evolution in one key respect: that it is haphazard.

Indeed, Dawkins and Huxley are similarly keen to stress that human culture is therefore a powerful but ultimately ambiguous force that brings about good and ill alike. As Huxley continues:

“Unfortunately, tradition can hand on bad as well as good items. It can hand on prejudices and superstitions just as effectively as it can hand on science and decent ethical codes. Here again we see the strange ambivalence of this extraordinary gift.” 16

We might also carry these ideas a little further by adding a very important determinant of individual human behaviour which such notions of ‘memetics’ have tended to overlook. For memes are basically ideas, and ideas are, by definition, a product and manifestation of conscious thought and transmission; whereas people, as I have discussed above, often behave in ways that conflict with their conscious beliefs and desires, which means that to some extent we act according to mental processes that are beyond or even alien to our immediate understanding.

Acknowledging the influence of the unconscious on our thoughts and behaviours, my contention here is straightforward enough and I think hard to dispute: that just as our conscious minds are moulded and differentiated by local customs and conventions, our unconscious minds are presumably likewise formed and diversified. That, to offer a more concrete example, the Chinese unconscious, shaped and informed by almost three millennia of Daoism, Buddhism and Confucianism, is likely to be markedly different from the unconscious mind of any of us raised within the European tradition. Besides the variations due to religio-philosophical upbringing, divergence is likely to be further compounded by the wide disparities in our languages, with dissimilarities in all elements from vocabulary, syntax and morphology down to the use of characters rather than letters.

Native tongue (or mother tongue) is a very direct and primary filter that not only channels what we are able to articulate, but governs what we are able to fully conceptualise or even to think at all. 17 It is perfectly conceivable therefore that anyone who learned to communicate first in Mandarin or Cantonese will be unconsciously differentiated from someone who learnt to speak English, Spanish or Arabic instead. 18 Indeed, to a lesser degree perhaps, all who speak English as a first language may have an alternate, if more subtly differentiated unconscious relationship to the world, from those whose mother tongue is say French or German. 19

So now I come back to the idea of memes in an attempt to resurrect it in an altered form. Like Dawkins’ original proposal, my idea is not rigorous or scientific; it’s another hunch: a way of referencing perhaps slight but characteristic differences in the collective unconscious between nations, tribes and also classes of society. Differences that then manifest perhaps as neuroses and complexes which are entirely planted within specific cultural identities – a British complex, for instance (and certainly we talk of having “an island mentality”). We might say therefore that alongside the transmission of memes, we also need to include the transmission of ‘dremes’ – cultural fragments from our direct social environment that are unconsciously given and received.

*

If this is accepted, then my further contention is that one such dreme has become predominant all around the world, and here I am alluding to what might be christened the ‘American Dreme’. And no, not the “American Dream”, which is different. The American Dream is in fact an excellent example of what Dawkins labelled a meme: a cultural notion that on this occasion encapsulates a collection of ideas about how life can and ought to be. It says that life should be better, richer and fuller for everyone. Indeed, it is written indelibly into the American Declaration of Independence in the wonderful phrase: “Life, Liberty and the pursuit of Happiness.” The American Dream is inspiring and has no doubt been a tremendous liberation for many; engendering technological progress and motivating millions with hopes that anyone living in “The Land of Opportunity” “can make it” “from rags to riches” – all subordinate memes encapsulating different aspects of the fuller American Dream.

E pluribus unum – “Out of many one” – is the motto inscribed on the scroll held so firmly by the beak of the bald eagle on the Seal of the United States. 20  Again, it is another sub-meme at the heart of the American Dream meme: an emblematic call for an unbound union between the individual and collective; inspiring a loose harmony poetically compared to the relationship of flowers in a bouquet – thus, not a mixing-pot, but a richer mosaic that maintains the original diversity.

Underlying this American Dream, a related sub-meme cherishes “rugged individualism”: the aspiration of individuals, not always pulling together, nor necessarily in one direction, but constantly striving upwards – pulling themselves up by their own bootstraps! Why? Because according to the dream at least, if you try hard enough, then you must succeed. And though this figurative pulling yourself up by your own bootstraps involves a physical impossibility that contravenes Newton’s Laws, even this does not detract from the idea. Believers in the American Dream apparently don’t notice any contradiction, despite the fantastical image of their central metaphor. The dream is buoyed so high on hope, even when deep down most know it’s actually a fairy tale.

So finally there is desperation and a sickliness about the American Dream. A harsh reality in which “The Land of Opportunity” turns out to be a steep-sided pyramid spanned by labyrinthine avenues that mostly run to dead-ends. A promised land but one riven by chasms as vast as the Grand Canyon; disparities that grew out of historical failures: insurmountable gulfs in wealth and real opportunity across a population always beset by class and racial inequalities. Indeed, the underclass of modern America is no less stuck within societal ruts than the underclass of the least developed regions on earth, and in relative terms many are worse off. 21 “It’s called the American Dream”, said the late, great satirist George Carlin, “because you have to be asleep to believe it”.

In short, to keep dreaming the American Dream involves an unresting commitment. Its most fervent acolytes live in a perpetually suspended state of ignorance or outright denial; denial of the everyday miseries and cruelties that ordinary Americans daily suffer: the ‘American Reality’.

Graphic from page 56 of Jean Kilbourne’s book Can’t Buy My Love: How Advertising Changes the Way We Think and Feel (originally published in hardcover in 1999 as Deadly Persuasion: ‘Why Women and Girls Must Fight the Addictive Power of Advertising’). It was an ad for a German marketing firm, contained within a decades-old issue of the trade journal ‘Advertising Age’:


But just suppose for a moment that the American Dream actually did come true. That America somehow escaped from this lingering malaise and blossomed into a land of real freedom and opportunity for all as it always promised to be. Yet still an unassailable problem remains. For as with every ascent, the higher you reach the more precarious your position becomes: as apes we have never entirely forgotten how branches are thinner and fewest at the top of the tree.

Moreover, built into the American Dream is its emphasis on material enrichment: to rise towards the heavens therefore means riding up and up, always on a mountain of stuff. And, as you rise, others must, in relative terms, fall. Not necessarily because there isn’t enough stuff to go around, but because success depends upon holding ownership of the greatest share. Which means that even if the American Reality drew closer to the American Dream (and it could hardly get much further away), creating optimal social mobility and realisable opportunities for all, the rise of some at the expense of others would still cultivate anxious winners and a disadvantaged underclass, for whom the relative material gain of the winners comes at the cost of bearing the stigma of comparative failure.

Why am I not nearer the top of the tree? In the greatest land on earth, why do I remain subservient to the gilded elites? Worries that nowadays plague the insomniac hours of many a hopeful loser; of those who landed up, to a large extent by accidental circumstance, in the all-too-fixed trailer parks of “The Land of the Free” (yet another sub-meme – ironically linked to the country with the highest incarceration rate on earth).

But worse, there is an inevitable shadow cast by the American Dream: a growing spectre of alienation and narcissism that arises from such excessive emphasis on individual achievement: feelings of inferiority for those who missed the boat, and of superiority for those who caught the gravy train. Manipulation is celebrated. Machiavellianism, narcissism and psychopathy come to reign. This shadow is part of what we might call the ‘American Dreme’; an unconscious offspring that contains within it a truly abysmal contrast to the American Dream which bore it. A dreme that, carried upon the coat-tails of the Dream, was spread far and wide by Hollywood and by Disney, radiated out in radio and television transmissions, and in consequence is now becoming the ‘Global Dreme’.

Being unconscious of it, however, we are mostly unaware of any affliction whatsoever; the dreme being insidious, and thus very much more dangerous than the meme. We might even mistake it for something else – having become such a pandemic, we might easily misdiagnose it as a normal part of ‘human nature’.

*

Here is Chris Hedges again with his own analysis of modern day consumerism, totalitarian corporate power and living in a culture dominated by pervasive illusion:

“Working for the American Dream”, first broadcast by the BBC in July 2018 and embedded below, is American comedian Rich Hall’s affectionate though characteristically sardonic portrait of the nation’s foundational and persistent myth:

*

And the joke was hilarious, wasn’t it? No, you didn’t like it…? Well, if beauty is in the eye of the beholder, comedy surely lies in the marrow of the funny bone! Which brings me to ask: why is there comedy? More broadly, why is there laughter – surely the most curious human reflex of all – or its very closely related reflex cousin, crying? In fact, the emission of tears from the lacrimal glands other than in response to irritation of our ocular structures, and purely for reasons of joy or sorrow, is a very nearly uniquely human secretomotor phenomenon. (Excuse my Latin!) 22

The jury is still out on the evolutionary function of laughing and crying, but when considered in strictly Darwinian terms (as current Science insists), it is hard to fathom why these dangerously debilitating and potentially life-threatening responses ever developed in any species. It is acknowledged indeed that a handful of unlucky (perhaps lucky?) people have literally died from laughter. So why do we laugh? Why do we love laughter, whether ours or others’, so much? Your guess is as good as mine, and, more importantly, as good as Darwin’s:

Many curious discussions have been written on the causes of laughter with grown-up persons. The subject is extremely complex. Something incongruous or unaccountable, exciting surprise and some sense of superiority in the laugher, who must be in a happy frame of mind, seems to be the commonest cause. 23

Less generously, Thomas Hobbes, who explained all human behaviour in terms of gaining social advantage, wrote that:

Joy, arising from imagination of a man’s own power and ability, is that exultation of the mind which is called glorying… Sudden Glory, is the passion which maketh those grimaces called LAUGHTER; and is caused either by some sudden act of their own, that pleaseth them; or by the apprehension of some deformed thing in another, by comparison whereof they suddenly applaud themselves. 24

And indeed, it is true that a great deal of laughter is at the expense of some butt of our joking; however, not all mockery involves an injured party, and there is a great deal more to humour and laughter than mere ridicule and contempt. So Hobbes’ account is at best a very desiccated postulation for why humans laugh, let alone what constitutes joy.

Indeed, Hobbes’ reductionism is evidently mistaken and misinformed not only by his deep-seated misanthropy, but also by a seeming lack of common insight which leads one to suspect that when it came to sharing any jokes, he just didn’t get it. But precisely what didn’t he get?

Well, apparently he didn’t get how laughter can be a straightforward expression of joie de vivre. Too French I imagine! Or that when we apprehend anything, this momentarily snaps us from a prior state of inattention, and on finding amusement in an abrupt, often fleeting, but totally fresh understanding, the revelation itself may elicit laughter (as I already outlined above). Or that it is simply impossible to laugh authentically or infectiously unless you not only understand the joke, but fully acknowledge it. In this way, humour, if confessional, can be liberating at a deeply personal level, or if satirical, liberating at a penetrating societal level. Lastly (in my necessarily limited rundown), humour serves as a wonderfully efficient and entertaining springboard for communicating insight and understanding, especially when the truths are dry, difficult to grasp or otherwise unpalatable. Here is a rhetorical economy that Hobbes might actually have approved of, were it not for his somewhat curmudgeonly disposition.

And why tell a joke here? Just to make you laugh and take your mind off the gravity of the topics covered and still more grave ones to come? To an extent, yes, but also to broaden out our discussion, letting it drift off into related philosophical avenues. For existence is seemingly absurd, is it not? Considered squarely, full-frontal, what’s it all about…? And jokes – especially ones that work beyond rational understanding – offer a playful recognition of the nonsensicalness of existence and of our species’ farcical determination to comprehend it and ourselves fully. What gives us the gall to ever speculate on the meaning of life, the universe and everything?

Meanwhile, we are free to choose: do we laugh or do we cry at our weird predicament? Both responses are surely sounder than cool insouciance, since both are flushed with blood. And were we madder, we might scream instead, of course, whether in joy or terror. As Theseus says in Shakespeare’s A Midsummer Night’s Dream:

Lovers and madmen have such seething brains,
Such shaping fantasies, that apprehend
More than cool reason ever comprehends.
The lunatic, the lover, and the poet
Are of imagination all compact.

*

French existentialist Albert Camus famously made the claim: “There is but one truly serious philosophical problem and that is suicide.” 25 Camus was not an advocate of suicide, of course; far from it. In fact, he saw it as a perfectly vain attempt to flee from the inescapable absurdity of life, something he believed we ought to embrace in order to live authentically. Indeed, Camus regarded every attempt to deny the ultimate meaninglessness of life in a universe indifferent to our suffering as a surrogate form of psychological suicide.

But rather than staring blankly into the abyss, Camus urges us to rebel against it: to face its absurdity without flinching and, through rebellion – by virtue of which we individually, if paradoxically, reconstruct the meaning of our lives afresh – to come face to face with extreme rationality. Although perhaps he goes too far, and reaches a point so extreme that few can follow: such a Sisyphean outlook being too desolate for most of us, and his exhortation to authenticity so impassioned that it seems almost infinitely taxing. 26 Kierkegaard’s “leap of faith” is arguably more forgiving of the human condition – but enough of philosophy. 27

This pause is meant for introspection. I have therefore presented an opportunity to reconsider how my interlude set out: not only by telling a joke – and hopefully one that made you smile if not laugh out loud – but also by reflecting upon the beautiful wisdom encapsulated in Chuang Tzu’s dream of becoming a butterfly; mystical enlightenment from fourth-century BC China that clashes intentionally with the plain silliness of a doctor-doctor joke about a moth-man; a surreal quip about clinical diagnosis and psychiatry (something I shall be coming to consider next).

However, the running theme here is one of transformation, and at the risk of also killing Chuang Tzu’s message by dissection, I will simply add (unnecessarily from the Daoist perspective) that existence does appear to be cyclically transformative; on personal, collective and altogether cosmic levels, the conscious and unconscious, spiralling outwards – whether upward into light or downward into darkness – each perpetually giving rise to the other just like the everblooming of yang and yin. As maverick clinical psychiatrist R. D. Laing once wrote:

“Most people most of the time experience themselves and others in one way or another that I… call egoic. That is, centrally or peripherally, they experience the world and themselves in terms of a consistent identity, a me-here over against you-there, within a framework of certain ground structures of space and time shared with other members of their society… All religious and all existential philosophies have agreed that such egoic experience is a preliminary illusion, a veil, a film of maya—a dream to Heraclitus, and to Lao Tzu, the fundamental illusion of all Buddhism, a state of sleep, of death, of socially accepted madness, a womb state to which one has to die, from which one has to be born.” 28

Returning from the shadowlands of alienation to contemplate the glinting iridescent radiance of Tzu’s butterfly’s wings is an invitation to scrape away the dross of habituated semi-consciousness that veils the playful mystery of our minds. On a different occasion, Tzu wrote:

One who dreams of drinking wine may in the morning weep; one who dreams weeping may in the morning go out to hunt. During our dreams we do not know we are dreaming. We may even dream of interpreting a dream. Only on waking do we know it was a dream. Only after the great awakening will we realize that this is the great dream. And yet fools think they are awake, presuming to know that they are rulers or herdsmen. How dense! You and Confucius are both dreaming, and I who say you are a dream am also a dream. Such is my tale. It will probably be called preposterous, but after ten thousand generations there may be a great sage who will be able to explain it, a trivial interval equivalent to the passage from morning to night. 29

Thus the world about us is scarcely less a construct of our imagination than our dreams are, deconstructed by the senses then seamlessly reconstructed in its entirety. And not just reconfigured via inputs from the celebrated five gateways of vision, sound, touch, taste and smell, but all portals including those of memory, intuition, and even reason. After all, it is curious how we speak of having ‘a sense’ of reason, just as we do ‘a sense’ of humour. Well, do we have… a sense of reason and a sense of humour? If you have followed this far then I sense you may share my own.

Next chapter…

*

Richard Rohr is a Franciscan priest, author and teacher, who says that his calling has been “to retrieve and re-teach the wisdom that has been lost, ignored or misunderstood in the Judeo-Christian tradition.” Rohr is the founder of the Center for Action and Contemplation and academic dean of the CAC’s Living School, where he practises incarnational mysticism, non-dual consciousness, and contemplation, with a particular emphasis on how these affect the social justice issues of our time. Recently he shared his inspirational standpoint in an hour-long chat with Rick Archer, host of ‘Buddha at the Gas Pump’:

*

Addendum: anyone with half a brain

“The intuitive mind is a sacred gift and the rational mind is a faithful servant. We have created a society that honors the servant and has forgotten the gift.”

— attributed to Albert Einstein 30

*

The development of split-brain operations for the treatment of severe cases of epilepsy, which involve the severing of the corpus callosum – the thick web of nerves that allows communication between the two hemispheres – first drew attention to how the left and right hemispheres have quite different attributes. Unfortunately, the early studies in this field produced erroneous, because superficial, notions about left and right brain functions, which were in turn vulgarised and popularised as they percolated down into pop psychology and management theory. The left brain was said to generate language and logic, while the right brain alone supposedly dealt with feelings and was the creative centre. In reality, both hemispheres are involved in all aspects of cognition, and as a consequence the study of what is technically called the lateralisation of brain function fell to some extent into academic disrepute.

In fact, important differences do occur between the specialisms of the left and right hemispheres, although, as psychiatrist Iain McGilchrist proposes in his book The Master and His Emissary (titles he sees as describing the proper roles of the right and left hemispheres respectively) 31, it is often better to understand the distinctions in terms of where conscious awareness is placed. In summary, the left hemisphere attends to and focuses narrowly but precisely on what is immediately in front of you – allowing you to strike the nail with the hammer, thread the eye of the needle, sort the wheat from the chaff (or whatever activity you might be actively engaged with) – while the right hemisphere remains highly vigilant and attentive to the surroundings. Thus, the left brain operates tools and usefully sizes up situations, while the right brain’s immediate relationship to the environment and to our bodies makes it the mediator of social activities and of a far broader conscious awareness. However, according to McGilchrist, the left brain is also convinced of its own primacy, whereas the right is incapable of comprehending such hierarchies; this is arguably the root of a problem we all face, since it repeatedly leads humans to construct societal arrangements and norms in accordance with left brain dominance, to the inevitable detriment of less restricted right brain awareness.

This has become the informed view of McGilchrist, supported by many decades of research, and granted that his overarching thesis has merit – note that the basic distinctions between left and right brain awareness are uncontroversial and well understood in psychology, whereas what he sees as the socio-historical repercussions is more speculative – it raises brain function lateralisation as a major underlying issue that needs to be incorporated into any final appraisal of ‘human nature’, the implications of which McGilchrist propounds at length in his own writing. In the preface to the new expanded edition of The Master and His Emissary (first published in 2009), he writes:

I don’t want it to be possible, after reading this book, for any intelligent person ever again to see the right hemisphere as the ‘minor’ hemisphere, as it used to be called – still worse the flighty, impetuous, fantastical one, the unreliable but perhaps fluffy and cuddly one – and the left hemisphere as the solid, dependable, down-to-earth hemisphere, the one that does all the heavy lifting and is alone the intelligent source of our understanding. I might still be to some extent swimming against the current, but there are signs that the current may be changing direction.

*

Embedded below is a lecture given to the Royal Society of Arts (RSA) in 2010, in which he offers a concise overview of how, according to our current understanding, the ‘divided brain’ has profoundly altered human behaviour, culture and society:

To hear these ideas contextualised within an evolutionary account of brain laterality, I also recommend a lecture given to The Evolutionary Psychiatry Special Interest Group of the Royal College of Psychiatrists in London (EPSIG UK) in 2018:

For more from Iain McGilchrist I also recommend this extended interview with physicist and filmmaker Curt Jaimungal, host of Theories of Everything, which premiered on March 29th:

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos; in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.

*

1  “insensible perceptions are as important to [the science of minds, souls, and soul-like substances] as insensible corpuscles are to natural science, and it is just as unreasonable to reject the one as the other on the pretext that they are beyond the reach of our senses.” From the Preface of New Essays concerning Human Understanding by Gottfried Leibniz, written in 1704 (though not published until 1765); translation courtesy of the Stanford Encyclopedia of Philosophy.

2 From Anthropology from a Pragmatic Point of View by Immanuel Kant, first published in 1798.

3 “The definition of Psychology may be best given… as the description and explanation of states of consciousness as such. By states of consciousness are meant such things as sensations, desires, emotions, cognitions, reasonings, decisions, volitions, and the like. Their ‘explanation’ must of course include the study of their causes, conditions, and immediate consequences, so far as these can be ascertained.” from opening paragraph of “Introduction: Body and Mind” from The Principles of Psychology, by William James, first published in 1892.

4 Extract taken from The Varieties of Religious Experience, from the chapter on “The Sick Soul”.

5 Letter to his friend, Francis Child.

6 According to James, the first division of “the self” that can be discriminated is between “the self as known”, the me, and “the self as knower”, the I, or “pure ego”. The me he then suggests might be sub-divided into a constituent hierarchy: “the material me” at the lowest level, then “the social me” and, top-most, “the spiritual me”. It was not until very much later, in the 1920s, that Freud fully developed his own tripartite division of the psyche into id, ego and super-ego – a division that surely owes much to James.

7

In the spring of 1876, a young man of nineteen arrived in the seaside city of Trieste and set about a curious task. Every morning, as the fishermen brought in their catch, he went to meet them at the port, where he bought eels by the dozens and then the hundreds. He carried them home, to a dissection table in a corner of his room, and—from eight until noon, when he broke for lunch, and then again from one until six, when he quit for the day and went to ogle the women of Trieste on the street—he diligently slashed away, in search of gonads.

“My hands are stained by the white and red blood of the sea creatures,” he wrote to a friend. “All I see when I close my eyes is the shimmering dead tissue, which haunts my dreams, and all I can think about are the big questions, the ones that go hand in hand with testicles and ovaries—the universal, pivotal questions.”

The young man, whose name was Sigmund Freud, eventually followed his evolving questions in other directions. But in Trieste, elbow-deep in slime, he hoped to be the first person to find what men of science had been seeking for thousands of years: the testicles of an eel. To see them would be to begin to solve a profound mystery, one that had stumped Aristotle and countless successors throughout the history of natural science: Where do eels come from?

From an article entitled “Where Do Eels Come From?” written by Brooke Jarvis, published in New Yorker magazine on May 18, 2020. https://www.newyorker.com/magazine/2020/05/25/where-do-eels-come-from

8 In the BBC TV sci-fi comedy Red Dwarf (Series 1 episode “Confidence and Paranoia”), the eponymous characters form an alternative superego-id partnership, existing as physical manifestations which appear onboard as symptoms of Lister’s illness.

9 Fixing on specific erogenous zones of the body, Freud believed that libidinous desire shaped our psychological development in a very specific fashion, progressing naturally, if permitted, through the early oral and anal stages before, on reaching adulthood, arriving at the genital stage.

10 Jocasta, the queen of Thebes, is barren, and so she and her husband, the king Laius, decide to consult the Oracle of Delphi. The Oracle tells them that if Jocasta bears a son, then the son will kill his father and marry her. Later, when Jocasta does indeed have a son, Laius demands that a servant take the baby to a mountain to be abandoned, his ankles pinned together just in case. But Oracles are rarely mistaken and fate is hard to avoid, and so as it happens the servant spares the infant, giving him to a shepherd instead. Eventually, as fortune will have it, the infant is adopted by the king and queen of Corinth, and named Oedipus because of the swellings on his feet. Years pass. Then, one day Oedipus learns that the king and queen are not his parents, but when he asks them, they deny the truth. So Oedipus decides to put the question to the Oracle of Delphi instead, who, being an enigmatic type, refuses to identify his true parents but foretells his future instead, saying that he is destined to kill his father and marry his mother. Determined to avoid this, Oedipus resolves not to return home to Corinth, heading to, you guessed it, Thebes instead. He comes to an intersection of three roads and meets Laius driving a chariot. They argue about who has the right of way and then, in an early example of road rage, their argument spills into a fight, and thus Oedipus unwittingly kills his real father. Next up, he meets the sphinx, who asks its famous riddle. This is a question of life and death, all who have answered incorrectly having been killed and eaten, but Oedipus gets the answer right and so, obligingly, the sphinx kills itself instead. Having freed the people of Thebes from the sphinx, Oedipus receives the hand of the recently widowed Jocasta in marriage. All is well for a while, but then it comes to pass that Jocasta learns who Oedipus really is, and hangs herself. Then, later again, Oedipus discovers that he was the murderer of his own father, and gouges his own eyes out.

11 Sigmund Freud, The Interpretation of Dreams, chapter V, “The Material and Sources of Dreams”

12 From an essay by C.G. Jung published in CW XI, Para 520. The word ‘Raca’ is an insult translated as ‘worthless’ or ‘empty’ taken from a passage in the Sermon on the Mount from Matthew 5:22.

13 Jung described the shadow in a key passage as “that hidden, repressed, for the most part inferior and guilt-laden personality whose ultimate ramifications reach back into the realm of our animal ancestors… If it has been believed hitherto that the human shadow was the source of evil, it can now be ascertained on closer investigation that the unconscious man, that is, his shadow, does not consist only of morally reprehensible tendencies, but also displays a number of good qualities, such as normal instincts, appropriate reactions, realistic insights, creative impulses etc.”

From Jung’s Collected Works, 9, part 2, paragraph 422–3.

14 From an interview with John Freeman, completed just two years before Jung’s death and broadcast as part of the BBC Face to Face TV series in 1959. Having asked about Jung’s childhood and whether he had to attend church, Freeman then asked: “Do you now believe in God?” Jung replied: “Now? Difficult to answer… I know. I don’t need to believe, I know.”

15 The quote in full reads: “This is the only story of mine whose moral I know. I don’t think it’s a marvelous moral, I just happen to know what it is: We are what we pretend to be, so we must be careful about what we pretend to be.” From Mother Night (1962) by Kurt Vonnegut.

16 The Human Situation is a collection of lectures first delivered by Aldous Huxley at the University of California in 1959. These were edited by Piero Ferrucci and first published in 1978 by Chatto & Windus, London. Both extracts here were taken from his lecture on “Language”, p 172.

17 This is the premise behind Orwell’s ‘Newspeak’ used in his dystopian novel Nineteen Eighty-Four. In Chapter 5, Syme, a language specialist and one of Winston Smith’s colleagues at the Ministry of Truth, explains enthusiastically to Winston:

“Don’t you see that the whole aim of Newspeak is to narrow the range of thought? In the end we shall make thoughtcrime literally impossible, because there will be no words in which to express it. Every concept that can ever be needed, will be expressed by exactly one word, with its meaning rigidly defined and all its subsidiary meanings rubbed out and forgotten.”

18 I should note that the idea proposed here is not altogether original: the concept of ‘linguistic relativity’ is jointly credited to linguists Edward Sapir and Benjamin Whorf, who, whilst working independently, came to the parallel conclusion that (in the strong form) language determines thought or (in the weak form) language and its usage influence thought. Whorf also inadvertently created the urban myth that Eskimos have a hundred words for snow after he wrote in a popular article: “We [English speakers] have the same word for falling snow, snow on the ground, snow hard packed like ice, slushy snow, wind-driven snow – whatever the situation may be. To an Eskimo, this all-inclusive word would be almost unthinkable…” The so-called “Sapir-Whorf hypothesis” continues to inspire research in psychology, anthropology and philosophy.

19 After writing this, I then read Richard Dawkins’ The Ancestor’s Tale. Aside from being a most wonderful account of what Dawkins poetically describes as his ‘pilgrimage to the dawn of life’, here Dawkins also returns to many earlier themes of other books, occasionally moderating or further elucidating previous thoughts and ideas. In the chapter entitled ‘The Peacock’s Tale’ [pp 278–280], he returns to speculate more about the role memes may have had in human development. In doing so he presents an idea put forward by his friend, the philosopher Daniel Dennett, in his book Consciousness Explained, which is that local variation of memes is inevitable:

“The haven all memes depend on reaching is the human mind, but the human mind is itself an artifact created when memes restructure a human brain in order to make it a better habitat for memes. The avenues for entry and departure are modified to suit local conditions, and strengthened by various artificial devices that enhance fidelity and prolixity of replication: native Chinese minds differ dramatically from native French minds, and literate minds differ from illiterate minds.” And is it not also implicit here that the unconscious brain will also be differently ‘restructured’ due to different environmental influences?

20 Barack Obama, whose own election was acclaimed by some and witnessed by many as proof of the American Dream, recently compared E pluribus unum to the Indonesian motto Bhinneka Tunggal Ika — unity in diversity.

“But I believe that the history of both America and Indonesia should give us hope. It is a story written into our national mottos. In the United States, our motto is E pluribus unum — out of many, one. Bhinneka Tunggal Ika — unity in diversity. (Applause.) We are two nations, which have traveled different paths. Yet our nations show that hundreds of millions who hold different beliefs can be united in freedom under one flag.” Press release (unedited) from The White House, posted November 10th, 2010: “remarks by the President at the University of Indonesia in Jakarta, Indonesia”

21 Summary of statistical analysis by the Center for American Progress, “Understanding Mobility in America”, by Tom Hertz, American University, published April 26th, 2006. Amongst the key findings was a discovery that “Children from low-income families have only a 1 percent chance of reaching the top 5 percent of the income distribution, versus children of the rich who have about a 22 percent chance [of remaining rich].” and that “By international standards, the United States has an unusually low level of intergenerational mobility: our parents’ income is highly predictive of our income as adults.” The report adds that “Intergenerational mobility in the United States is lower than in France, Germany, Sweden, Canada, Finland, Norway and Denmark. Among high-income countries for which comparable estimates are available, only the United Kingdom had a lower rate of mobility than the United States.”

Reproduced from an article entitled “Advertising vs. Democracy: An Interview with Jean Kilbourne” written by Hugh Iglarsh, published in Counterpunch magazine on October 23rd 2020. https://www.counterpunch.org/2020/10/23/advertising-vs-democracy-an-interview-with-jean-kilbourne/ 

22 In his follow-up to the more famous On the Origin of Species (1859) and The Descent of Man (1871), Charles Darwin reported in Chapter VI, entitled “Special Expressions of Man: Suffering and Weeping”, of his third major work The Expression of the Emotions in Man and Animals (1872), that:

I was anxious to ascertain whether there existed in any of the lower animals a similar relation between the contraction of the orbicular muscles during violent expiration and the secretion of tears; but there are very few animals which contract these muscles in a prolonged manner, or which shed tears. The Macacus maurus, which formerly wept so copiously in the Zoological Gardens, would have been a fine case for observation; but the two monkeys now there, and which are believed to belong to the same species, do not weep. Nevertheless they were carefully observed by Mr. Bartlett and myself, whilst screaming loudly, and they seemed to contract these muscles; but they moved about their cages so rapidly, that it was difficult to observe with certainty. No other monkey, as far as I have been able to ascertain, contracts its orbicular muscles whilst screaming.

The Indian elephant is known sometimes to weep. Sir E. Tennent, in describing these which he saw captured and bound in Ceylon, says, some “lay motionless on the ground, with no other indication of suffering than the tears which suffused their eyes and flowed incessantly.” Speaking of another elephant he says, “When overpowered and made fast, his grief was most affecting; his violence sank to utter prostration, and he lay on the ground, uttering choking cries, with tears trickling down his cheeks.” In the Zoological Gardens the keeper of the Indian elephants positively asserts that he has several times seen tears rolling down the face of the old female, when distressed by the removal of the young one. Hence I was extremely anxious to ascertain, as an extension of the relation between the contraction of the orbicular muscles and the shedding of tears in man, whether elephants when screaming or trumpeting loudly contract these muscles. At Mr. Bartlett’s desire the keeper ordered the old and the young elephant to trumpet; and we repeatedly saw in both animals that, just as the trumpeting began, the orbicular muscles, especially the lower ones, were distinctly contracted. On a subsequent occasion the keeper made the old elephant trumpet much more loudly, and invariably both the upper and lower orbicular muscles were strongly contracted, and now in an equal degree. It is a singular fact that the African elephant, which, however, is so different from the Indian species that it is placed by some naturalists in a distinct sub-genus, when made on two occasions to trumpet loudly, exhibited no trace of the contraction of the orbicular muscles.

The full text is uploaded here: https://www.gutenberg.org/files/1227/1227-h/1227-h.htm#link2HCH0006

23 Quote from The Expression of the Emotions in Man and Animals (1872), Chapter VIII “Joy, High Spirits, Love, Tender Feelings, Devotion” by Charles Darwin. He continues:

The circumstances must not be of a momentous nature: no poor man would laugh or smile on suddenly hearing that a large fortune had been bequeathed to him. If the mind is strongly excited by pleasurable feelings, and any little unexpected event or thought occurs, then, as Mr. Herbert Spencer remarks, “a large amount of nervous energy, instead of being allowed to expend itself in producing an equivalent amount of the new thoughts and emotion which were nascent, is suddenly checked in its flow.” . . . “The excess must discharge itself in some other direction, and there results an efflux through the motor nerves to various classes of the muscles, producing the half-convulsive actions we term laughter.” An observation, bearing on this point, was made by a correspondent during the recent siege of Paris, namely, that the German soldiers, after strong excitement from exposure to extreme danger, were particularly apt to burst out into loud laughter at the smallest joke. So again when young children are just beginning to cry, an unexpected event will sometimes suddenly turn their crying into laughter, which apparently serves equally well to expend their superfluous nervous energy.

The imagination is sometimes said to be tickled by a ludicrous idea; and this so-called tickling of the mind is curiously analogous with that of the body. Every one knows how immoderately children laugh, and how their whole bodies are convulsed when they are tickled. The anthropoid apes, as we have seen, likewise utter a reiterated sound, corresponding with our laughter, when they are tickled, especially under the armpits… Yet laughter from a ludicrous idea, though involuntary, cannot be called a strictly reflex action. In this case, and in that of laughter from being tickled, the mind must be in a pleasurable condition; a young child, if tickled by a strange man, would scream from fear…. From the fact that a child can hardly tickle itself, or in a much less degree than when tickled by another  person, it seems that the precise point to be touched must not be known; so with the mind, something unexpected – a novel or incongruous idea which breaks through an habitual train of thought – appears to be a strong element in the ludicrous.

24 Quote from Leviathan (1651), The First Part, Chapter 6, by Thomas Hobbes (with italics and spelling as in the original). Hobbes continues:

And it is incident most to them, that are conscious of the fewest abilities in themselves; who are forced to keep themselves in their own favour, by observing the imperfections of other men. And therefore much Laughter at the defects of others is a signe of Pusillanimity. For of great minds, one of the proper workes is, to help and free others from scorn; and compare themselves onely with the most able.

Interestingly, Hobbes then immediately offers his account of weeping as follows:

On the contrary, Sudden Dejection is the passion that causeth WEEPING; and is caused by such accidents, as suddenly take away some vehement hope, or some prop of their power: and they are most subject to it, that rely principally on helps externall, such as are Women, and Children. Therefore, some Weep for the loss of Friends; Others for their unkindnesse; others for the sudden stop made to their thoughts of revenge, by Reconciliation. But in all cases, both Laughter and Weeping, are sudden motions; Custome taking them both away. For no man Laughs at old jests; or Weeps for an old calamity.

https://www.gutenberg.org/files/3207/3207-h/3207-h.htm#link2H_PART1

25 “Il n’y a qu’un problème philosophique vraiment sérieux : c’est le suicide.” (“There is but one truly serious philosophical problem, and that is suicide.”) Quote taken from The Myth of Sisyphus (1942) by Albert Camus, translated by Justin O’Brien.

26 In Greek mythology, Sisyphus was punished in the underworld by being forced to roll a huge boulder up a hill, only for it to roll back down every time, repeating his task for eternity. In his philosophical essay The Myth of Sisyphus (1942) Camus compares this unremitting and unrewarding task of Sisyphus to the lives of ordinary people in the modern world, writing:

“The workman of today works every day in his life at the same tasks, and this fate is no less absurd. But it is tragic only at the rare moments when it becomes conscious.”

In sympathy he also muses on Sisyphus’ thoughts especially as he trudges in despair back down the mountain to collect the rock again. He writes:

“You have already grasped that Sisyphus is the absurd hero. He is, as much through his passions as through his torture. His scorn of the gods, his hatred of death, and his passion for life won him that unspeakable penalty in which the whole being is exerted toward accomplishing nothing. This is the price that must be paid for the passions of this earth. Nothing is told us about Sisyphus in the underworld. Myths are made for the imagination to breathe life into them.”

Continuing:

“It is during that return, that pause, that Sisyphus interests me. A face that toils so close to stones is already stone itself! I see that man going back down with a heavy yet measured step toward the torment of which he will never know the end. That hour like a breathing-space which returns as surely as his suffering, that is the hour of consciousness. At each of those moments when he leaves the heights and gradually sinks toward the lairs of the gods, he is superior to his fate. He is stronger than his rock.

“If this myth is tragic, that is because its hero is conscious. Where would his torture be, indeed, if at every step the hope of succeeding upheld him? The workman of today works every day in his life at the same tasks, and his fate is no less absurd. But it is tragic only at the rare moments when it becomes conscious. Sisyphus, proletarian of the gods, powerless and rebellious, knows the whole extent of his wretched condition: it is what he thinks of during his descent. The lucidity that was to constitute his torture at the same time crowns his victory. There is no fate that cannot be surmounted by scorn.”

You can read the extended passage here: http://dbanach.com/sisyphus.htm

27 Søren Kierkegaard never actually coined the term “leap of faith” although he did use the more general notion of “leap” to describe situations whenever a person is faced with a choice that cannot be fully justified rationally. Moreover, in this instance the “leap” is perhaps better described as a leap “towards” or “into” faith that finally overcomes what Kierkegaard saw as an inherent paradoxical contradiction between the ethical and the religious. However, Kierkegaard never advocates “blind faith”, but instead recognises that faith ultimately calls for action in the face of absurdity.

In Part Two, “The Subjective Issue”, of his 1846 work and impassioned attack against Hegelianism, Concluding Unscientific Postscript to the Philosophical Fragments (Danish: Afsluttende uvidenskabelig Efterskrift til de philosophiske Smuler), which is known for its dictum, “Subjectivity is Truth”, Kierkegaard wrote:

“When someone is to leap he must certainly do it alone and also be alone in properly understanding that it is an impossibility… the leap is the decision… I am charging the individual in question with not willing to stop the infinity of [self-]reflection. Am I requiring something of him, then? But on the other hand, in a genuinely speculative way, I assume that reflection stops of its own accord. Why, then, do I require something of him? And what do I require of him? I require a resolution.”

28 R. D. Laing, The Politics of Experience  (Ballantine Books, N.Y., 1967)

29 Quoted from the book known as Zhuangzi (also transliterated as Chuang Tzu or Chuang Chou). Translation by Lin Yutang.

30 Although in all likelihood this is a reworking of a passage from a book entitled The Metaphoric Mind: A Celebration of Creative Consciousness, written by Bob Samples and published in 1976, in which the fuller passage reads [with emphasis added]:

“The metaphoric mind is a maverick. It is as wild and unruly as a child. It follows us doggedly and plagues us with its presence as we wander the contrived corridors of rationality. It is a metaphoric link with the unknown called religion that causes us to build cathedrals — and the very cathedrals are built with rational, logical plans. When some personal crisis or the bewildering chaos of everyday life closes in on us, we often rush to worship the rationally-planned cathedral and ignore the religion. Albert Einstein called the intuitive or metaphoric mind a sacred gift. He added that the rational mind was a faithful servant. It is paradoxical that in the context of modern life we have begun to worship the servant and defile the divine.”

31 The book is subtitled The Divided Brain and the Making of the Western World
