
on the side of the angels

The following article is Chapter Three of a book entitled Finishing The Rat Race. All previously uploaded chapters are available (in sequence) by following the link above or from the category link in the main menu, where you will also find a table of contents and a preface on why I started writing it.

*

What a piece of work is a man!

— William Shakespeare 1

*

Almost two decades ago, as explosions lit up the night sky above Baghdad, I was at my parents’ home in Shropshire, sat on the sofa watching the rolling news coverage. After a few hours we were still watching the same news, though for some reason the sound was now off and the music system on.

“It’s a funny thing,” I remarked, between sips of whisky, and not certain at all where my words were leading, “that humans can do this… and yet also… this.” I suppose that I was trying to firm up a feeling. A feeling that arose in response to the unsettling juxtaposition of images and music, and that involved my parents and myself in different ways, as detached spectators. But my father didn’t understand at first, and so I tried again.

“I mean how can it be,” I hesitated, “that on the one hand we are capable of making such beautiful things like music, and yet on the other, we are the engineers of such appalling acts of destruction?” Doubtless I could have gone on elaborating, but there was no need. My father understood my meaning, and the evidence of what I was trying to convey was starkly before us – human constructions of the sublime and the atrocious side-by-side.

In any case, the question, being one of unavoidable and immediate importance to all of us, sort of hangs in the air perpetually, although as a question it is usually considered and recast in alternative ways – something I shall return to – while mostly it remains not merely unanswered, but unspoken. We treat it instead like an embarrassing family secret, which is best forgotten. Framed hesitantly but well enough, my question drew a predictable answer from my father: “that’s human nature”; which is the quick and easy answer although it actually misses the point entirely – a common fallacy technically known as ignoratio elenchi. For ‘human nature’ in no way provides an answer but simply opens a new question. Just what is human nature? – This is the question.

The generous humanity of music and the indiscriminate but cleverly conceived cruelty of carpet bombing are just different manifestations of what human beings are capable of, and thus of human nature. If you point to both and say “this is human nature”, well yes – and obviously there’s a great deal else besides – whereas if you reserve the term only for occasions when you feel disapproval, revulsion or outright horror – as many do – then your condemnation is simply another feature of “human nature”. In fact, why do we judge ourselves at all?

So this chapter represents an extremely modest attempt to grapple with what is arguably the most complex and involved question of all questions. Easy answers are good when they cut to the bone of a difficult problem; however, to explain man’s inhumanity to man, as well as to his other fellow creatures, surely deserves a better and fuller account than that man is by nature inhumane – if for no other reason than that the very word ‘human’ owes its origins to the earlier form ‘humane’! Upon this etymological root is there really nothing else but vainglorious self-deception and wishful thinking? I trust that language is in truth less consciously contrived.

The real question then is surely this: When man becomes inhumane, why on this occasion or in this situation, but not on all occasions and under all circumstances? And how come we still use the term ‘inhumane’ at all, if being inhumane is so hard-wired into our human nature? The lessons to be learned by tackling such questions can hardly be overstated; lessons that might well prove crucial in securing the future survival of our societies, our species, and perhaps of the whole planet.

*

I        Monkey business

There are one hundred and ninety-three living species of monkeys and apes. One hundred and ninety-two of them are covered with hair.

— Desmond Morris 2

*

The scene: just before sunrise about one million years BC, a troop of hominids are waking up and about to discover a strange, rectangular, black monolith that has materialised from nowhere. As the initial excitement and fear of this strange new object wears off, the hominids move closer to investigate. Attracted perhaps by its remarkable geometry, its precise and unnatural blackness, they reach out tentatively to touch it and then begin to stroke it.

As a direct, though unexplained consequence of this communion, one of the ape-men has a dawning realisation. Sat amongst the skeletal remains of a dead animal, he picks up one of the sun-bleached thigh bones and begins to swing it about. Aimless at first, his flailing attempts simply scatter the other bones of the skeleton. In time, however, he gains control and his blows increase in ferocity, until at last, with one almighty thwack, he manages to shatter the skull to pieces. It is a literally epoch-making moment of discovery.

The following day, as the troops mingle beside a water-hole, a fight breaks out. His new weapon in hand, our hero deals a fatal blow against the alpha male of a rival troop. Previously at the mercy of predators and reliant on scavenging to find their food, the tribe can now be freed from fear and hunger too. Triumphant, he is the ape-man Prometheus, and in ecstatic celebration of this achievement, he tosses the bone high into the air, whereupon, spinning up and up, higher and higher into the sky, the scene cuts from spinning bone to an orbiting space-craft…

*

Stanley Kubrick’s 2001: A Space Odyssey is enigmatic and elusive. Told in a sequence of related if highly differentiated parts, it repeatedly confounds the viewer’s expectations – the scene sketched above is only the opening act of Kubrick’s seminal science-fiction epic.

Kubrick said, “you are free to speculate as you wish about the philosophical and allegorical meaning of the film”. 3 So taking Kubrick at his word, I shall do just that – although not for every aspect of the film, but specifically for his first scene, up to and including that most revered and celebrated ‘match cut’ in cinema history, and for its relationship to Kubrick’s mesmerising and seemingly bewildering climax: moments of transformation, when reality per se is re-imagined. On one level, at least, all of the ideas conveyed in this opening as well as the more mysterious closing scenes (more below) are abundantly clear. For Kubrick’s exoteric message involves the familiar Darwinian interplay between the foxes and the rabbits and their perpetual battle for survival, which is the fundamental driving force behind the evolutionary development of natural species.

Not that Darwin’s conception should be misunderstood as war in the everyday sense, although this is a very popular interpretation; for one thing the adversaries in these Darwinian arms races, most often predator and prey, in general remain wholly unaware of any escalation in armaments and armour. Snakes, for example, have never sought to strengthen their venom, any more than their potential victims – most spectacularly the opossums that evolved to prey on them – made any conscious attempt to hone their blood-clotting agents. Today’s snake-eating opossums have extraordinary immunity to the venom of their prey purely because natural selection strongly favoured opossums with heightened immunity.

Of course, the case is quite different when we come to humankind. For it is humans alone who deliberately escalate their methods of attack and response and do so by means of technology. To talk of an “arms race” between species is therefore a somewhat clumsy metaphor for what actually occurs in nature – although Darwin is accurately reporting what he finds.

And there is another crucial difference between the Darwinian ‘arms race’ and the human variant. Competition between species is not always as direct as between predator and prey, and frequently looks nothing like a war at all. Indeed, it is more often analogous to the competitiveness of two hungry adventurers lost in a forest. It may well be that both of our adventurers are completely unaware that somewhere in the midst of the forest there is a hamburger left on a picnic table. And while neither adventurer may be aware of the presence of the other, they are – at least in a strict Darwinian sense – in competition, since if either one stumbles accidentally upon the hamburger then, merely by process of elimination, the other has lost his chance of a meal. As competitors then, the faster walker, or the one with keener eyes, or the one with greater stamina, will gain a slight but significant advantage over the other. Thus, perpetual competition between individuals need never amount to war, or even to battles, and this is how Darwin’s ideas are properly understood.

In any case, such contests of adaptation, whether between predators and prey, or sapling trees racing towards the sunlight, can never actually be won. The rabbits may get quicker but the foxes must get quicker too, since if either species fails to adapt then it will not survive long. So it’s actually a perpetual if dynamic stalemate, with species trapped like the Red Queen in Through the Looking-Glass, always having to keep moving ahead just to hold their ground – a paradox that evolutionary biologists indeed refer to as “the red queen hypothesis” 4.

We might still judge that both sides are advancing, since there is, undeniably, a kind of evolutionary progress, with the foxes growing craftier as the rabbits get smarter too, and so we might conclude that such an evolutionary ‘arms race’ is the royal road to all natural progress – although Darwin noted that other evolutionary pressures, most notably sexual selection, have tremendous influence as well. We might even go further by extending the principle in order to admit our own steady technological empowerment, viewed objectively as a by-product of our own rather more deliberate arms race. Progress is thus assured by the constant and seemingly inexorable fight for survival against hunger and the elements, and no less significantly, by the constant squabbling of our warring tribes over land and resources.

Space Odyssey draws deep from the science of Darwinism, and spins a tale of our future. From bony proto-tool, slowly but inexorably, we come to the mastery of space travel. From terrestrial infants, to cosmically-free adults – this is the overarching story of 2001. But wait, there’s more to that first scene than immediately meets the eye. That space-craft which Kubrick cuts to; it isn’t just any old space-craft…

Look quite closely and you might see that it’s actually one of four space-craft, similar in design, which form the components of an orbiting nuclear missile base, and though in the film this is not as clear as in Arthur C. Clarke’s parallel version of the story (the novel and film were co-creations written side-by-side), the missiles are there if you peer hard enough.

So Space Odyssey is, at least on one level, the depiction of technological development, which, though superficially from first tool to more magnificent uber-tool (i.e., the spacecraft), is also – and explicitly in the novel – a development from the first weapon to what is, up to now, the ultimate weapon, and thus from the first hominid-cide to the potential annihilation of the entire human population. 5

Yet 2001, the year in the title, also magically heralds a new dawn for mankind: a dawn that, as with every other dawn, bursts from the darkest hours. The meaning therefore, as far as I judge it, is that we, as parts of nature, are born to be both creators and destroyers; agents of light and darkness. That our innate and unassailable evolutionary drive, dark as it can be, also has the potential to lead us to the film’s weirdly antiseptic yet quasi-mystical conclusion, and the inevitability of our grandest awakening – a cosmic renaissance as we follow our destiny towards the stars.

Asked in an interview whether he agreed with some critics who had described 2001 as a profoundly religious film, Kubrick replied:

“I will say that the God concept is at the heart of 2001—but not any traditional, anthropomorphic image of God. I don’t believe in any of Earth’s monotheistic religions, but I do believe that one can construct an intriguing scientific definition of God, once you accept the fact that there are approximately 100 billion stars in our galaxy alone, that each star is a life-giving sun and that there are approximately 100 billion galaxies in just the visible universe.”

Continuing:

“When you think of the giant technological strides that man has made in a few millennia—less than a microsecond in the cosmology of the universe—can you imagine the evolutionary development that much older life forms have taken? They may have progressed from biological species, which are fragile shells for the mind at best, into immortal machine entities—and then, over innumerable eons, they could emerge from the chrysalis of matter transformed into beings of pure energy and spirit. Their potentialities would be limitless and their intelligence ungraspable by humans.”

When the interviewer pressed further, inquiring what this envisioned cosmic evolutionary path has to do with the nature of God, Kubrick added:

“Everything—because these beings would be gods to the billions of less advanced races in the universe, just as man would appear a god to an ant that somehow comprehended man’s existence. They would possess the twin attributes of all deities—omniscience and omnipotence… They would be incomprehensible to us except as gods; and if the tendrils of their consciousness ever brushed men’s minds, it is only the hand of God we could grasp as an explanation.” 6

Kubrick was an atheist, although unlike many atheists he acknowledged that the religious impulse is an instinctual drive no less irrepressible than our hungers to eat and to procreate. This is so because at the irreducible heart of religion lies pure transcendence: the climbing up and beyond ordinary states of being. This desire to transcend, whether by shamanic communion with the ancestors and animistic spirits, monastic practices of meditation and devotion, or by brute technological means, is something common to all cultures.

Thus the overarching message of 2001 is firstly that human nature is nature, for good and ill, and secondly that our innate capacity for reason will inexorably propel us to transcendence of our terrestrial origins. In short, it is the theory of Darwinian evolution writ large. Darwinism appropriated and repackaged as an updated creation story – a new mythology and surrogate religion that lends an alternative meaning to life. We will cease to worship nature or humanity, which is nature, it says, and if we continue to worship anything at all, our new icons will be representative only of Progress (capital P). Thus, evolution usurps god! Of course, the symbolism of 2001 can be given esoteric meaning too – indeed, there can never be a final exhaustive analysis of 2001 because, like all masterpieces, its full meaning is open to an infinitude of interpretations – and this I leave entirely for others to speculate upon.

In 1997, Arthur C. Clarke was invited by the BBC to appear on a special edition of the documentary series ‘Seven Wonders of the World’ (Season 2).

*

I have returned to Darwin just because his vision of reality has become the accepted one. And once we acknowledge that human nature is indeed another natural outgrowth, it is always tempting to look to Darwin for answers. However, as I touched upon in the previous chapter, though Darwinism as biological mechanism is extremely well-established science, interpretations that follow from those established evolutionary principles differ, and this is especially the case when we try to make sense of patterns of animal behaviour: how much stress to place on our own innate biological drives remains an even more hotly contested matter. But if we are to adjudicate fairly on this point then it is worthwhile first to consider how Darwin’s own ideas originated and developed.

In fact, as with all great scientific discoveries, we can trace a number of precursors including the nascent theory of his grandfather Erasmus, a founder member of the Lunar Society, who wrote lyrically in his seminal work Zoonomia:

“Would it be too bold to imagine, that in the great length of time, since the earth began to exist, perhaps millions of ages before the commencement of the history of mankind, would it be too bold to imagine, that all warm-blooded animals have arisen from one living filament, which THE GREAT FIRST CAUSE endued with animality, with the power of acquiring new parts, attended with new propensities, directed by irritations, sensations, volitions, and associations; and thus possessing the faculty of continuing to improve by its own inherent activity, and of delivering down those improvements by generation to its posterity, world without end!” 7

So doubtless Erasmus sowed the seeds for the Darwinian revolution, although his influence alone does not account for Charles Darwin’s central tenet that it is “the struggle for existence” which provides, as indeed it does, one plausible and vitally important mechanism in the process of natural selection, and thus, a key component in his complete explanation for the existence of such an abundant diversity of species. But again, what caused Charles Darwin to suspect that “the struggle for existence” necessarily involved such “a war of all against all” to begin with?

In fact, Darwin had borrowed this idea of “the struggle for existence”, a phrase he uses as the title of chapter three of The Origin of Species, directly from Thomas Malthus 8. And interestingly, Alfred Russel Wallace, the less remembered co-discoverer of evolutionary natural selection, who had reached his own conclusions entirely independently of Darwin’s work, was also inspired in part by thoughts of this same concept, which though ancient in origin was already widely attributed to Malthus.

However, the notion of “a war of all against all” traces back still further, at least as far back as the English Civil War, and to the writings of the highly influential political philosopher Thomas Hobbes. 9

So it is indirectly from the writings of these two redoubtable Thomases that much of our modern thinking about Nature, and therefore, by extension, about human nature, has been drawn. It is instructive therefore to examine the original context in which Hobbes’ and Malthus’s own ideas formed and developed; contributions that have been crucial not only to the evolution of evolutionary thinking, but foundational to the development of post-Enlightenment western civilisation. To avoid too much of a digression, I have decided to leave further discussion of Malthus and his continuing legacy for the addendum below, and to focus attention here solely on the thoughts and influence of Hobbes. But to get to Hobbes, who first devoted his attention to the study of the natural sciences and optics in particular, I’d like to begin with a brief diversion by way of my own subject, Physics.

*

The title of Thomas Pynchon’s most celebrated novel, Gravity’s Rainbow, published in 1973, darkly alludes to the ballistic flight path of Germany’s V2 rockets that fell on London during the last days of the Second World War. Pynchon was able to conjure up this provocative metaphor because, by the late twentieth century, everyone knew perfectly well and seemingly from their own direct experience that projectiles follow a symmetrical and parabolic arc. It is strange to think, therefore, that for well over a millennium people in the western world, including the most scholarly among them, had falsely believed that motion followed a set of quite different laws, presuming the trajectory of a thrown object, rather than following any sweeping arc, must be understood instead as comprising two quite distinct phases.

Firstly, impelled upwards by a force, the object was presumed to enter a stage of “unnatural motion” as it climbed away from the earth’s surface – its natural resting place – before eventually running out of steam, and then abruptly falling back to earth under “natural motion”. This is indeed a common sense view of motion – the view that every child can instantly recognise and immediately comprehend – although as with many common sense views of the physical world, it is absolutely wrong.
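The modern description, by contrast, can be written down in a single line of the mathematics we shall shortly hear Galileo celebrate. Neglecting air resistance, a projectile launched with speed \(v_0\) at an angle \(\theta\) to the ground (with \(g\) the acceleration due to gravity) traces the curve

\[
y \;=\; x\tan\theta \;-\; \frac{g\,x^{2}}{2\,v_{0}^{2}\cos^{2}\theta}
\]

one smooth, symmetrical parabola, with no separate ‘unnatural’ and ‘natural’ phases anywhere in it.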

As a rather striking illustration of scientific progress, this shift in understanding was brought to my attention by a university professor who had worked it into an unforgettable demonstration that kicked off his lecture on error analysis. On the blackboard he first sketched out the two competing hypotheses: a beautifully smooth arc captioned ‘Galileo’ and then, to the left of it, a pair of disconnected arrows pointing diagonally up and then vertically down, labelled ‘Aristotle’. Obviously Galileo was about to win, but then came the punchline as he pulled out a balloon and slapped it at an angle of approximately forty-five degrees, before we all watched it drift back to earth just as Aristotle would have predicted! With tremendous glee he then chalked an emphatic cross to dismiss Galileo’s model, before spelling out the message (in case you hadn’t understood) that, above and beyond all other considerations, it is essential to design your experiment and carry out observations with due care! 10

Now, legend tells us that Newton was sitting under an apple tree in his garden, unable to fathom what force could maintain the earth in its orbit around the sun, when all of a sudden an apple fell and hit him on the head. And if this is a faithful account of Newton’s Eureka moment, then the accidental symbolism is striking. I might even venture to suggest that by implication it was this fall of Newton’s apple that redeemed humanity; snapping Newton, and by extension all humanity, spontaneously out of darkness and into an Age of Reason. For if expulsion from Eden involved eating an apple, symbolically at least, Newton’s apple paved the way for a new golden age. Or, as poet Alexander Pope wrote so exuberantly: “Nature and Nature’s laws lay hid in night: God said, Let Newton be! and all was light.” 11

Of course Newton’s journey into light had not been a solo venture, and as he said himself, “if I have seen further, it is by standing on the shoulders of giants.” 12

These predecessors and contemporaries whom Newton implicitly pays homage to would include Descartes, Huygens, and Kepler, although the name that stands tallest today is Galileo of course. For it was Galileo’s observations and insights that led more or less ineluctably to what today are called Newton’s Laws, and in particular Newton’s First Law, which states (in various formulations) that objects remain in uniform motion or at rest unless acted upon by a force.

This deceptively simple law has many surprising consequences. For instance, it means that when we see an object moving faster and faster, or else slower and slower, or – and this is an important point – changing its direction of motion, then we can deduce there must be a net force acting on it. It also follows that a force is required to bend the path of the earth about the sun, and, likewise, one causing the moon to revolve about the earth; hence gravity. Conversely, if an object is at rest (or moving in a straight line at constant speed – the law makes no distinction) then we know the forces acting on it must be balanced in such a way as to cancel to zero. Thus, we can tell purely from any object’s motion whether the forces acting on it are ‘in equilibrium’ or not.

An alternative way of thinking about Newton’s First Law requires the introduction of a related idea called ‘inertia’. This is the ‘reluctance’ of every object to change its motion, and it transpires that the more massive the object, the greater its inertia – so here I am paraphrasing Newton’s Second Law. Given a situation in which there are no forces acting (so no resistive forces like friction or drag), then according to this law the object must travel continually with unchanging velocity. This completely counterintuitive discovery was arguably Galileo’s finest achievement, and it is the same principle that underlies proposed hyperloop technology – high-speed maglev trains designed to run almost without friction through evacuated tunnels. It also permitted Galileo’s understanding of how the earth could revolve indefinitely around the sun, and do so without our ever noticing.
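Stated compactly in modern notation (which of course postdates both men), the two laws just paraphrased read:

\[
\sum \vec{F} = \vec{0} \;\Longleftrightarrow\; \vec{v} = \text{constant} \qquad \text{(First Law)}
\]

\[
\vec{F} = m\,\vec{a} \qquad \text{(Second Law)}
\]

where the mass \(m\) is precisely the measure of a body’s inertia: the larger it is, the smaller the acceleration \(\vec{a}\) produced by a given force.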

Where others had falsely presumed that the birds would get left behind if the earth was in motion, Galileo saw that the earth’s moving platform was no different in principle from a travelling ship, and that, just like onboard a ship, nothing will be left behind as it travels forward – this is easier to envisage if you imagine sitting on a train and recall how it feels at constant speed if the rails are smooth, such that you sometimes cannot even tell whether the train you are on or the one on the other platform is moving.

Of course, when Galileo insisted on a heliocentric reality, he was directly challenging papal authority and paid the inevitable price for his impertinence. Moreover, when he implored his opponents merely to look through his own telescope and see for themselves, they simply declined his honest invitation. Which is simply the nature of belief – not just religious variants but all forms – for such ‘confirmation bias’ lies deep within our nature, causing most of us to have little desire to make new discoveries or learn new facts if ever these threaten to disrupt our hard-won opinions on matters of central concern.

So finally the Inquisition in Rome tried him, and naturally enough they found him guilty, sentencing Galileo to lifelong house arrest with a strict ban on publishing his ideas. Given the age, this was comparatively lenient; three decades earlier the Dominican friar and philosopher Giordano Bruno, who amongst other blasphemies had dared to suggest the universe had no centre and that the stars were just other suns surrounded by planets of their own, was burned at the stake.

Today, our temptation is to regard the Vatican’s hostility to Galileo’s new science as a straightforward attempt to deny reality purely because it devalued the Biblical story, which places not just earth, but the holy city of Jerusalem, at the centre of the cosmos. However, Galileo’s heresy actually strikes a more fundamental blow, since it challenges not only papal infallibility but the entire millennium-long Scholastic tradition – the tripartite dialectical synergy of Aristotle, Neoplatonism and Christianity – and by extension, the whole hierarchical establishment of the late medieval period and much more.

Prior to Galileo, as my professor illustrated so expertly with his hilarious balloon demonstration, the view had endured that all objects obeyed laws according to their inherent nature. Thus, rocks fell to earth because they were by nature ‘earthly’, whereas the sun and moon remained high above us because they were made of altogether more heavenly stuff. In short, things back then knew their place.

By contrast, Galileo’s explanation is startlingly egalitarian. According to his radical reinterpretation, not only do all things obey common laws – ones that apply no less resolutely to the great celestial bodies than to everyday sticks and stones – but, no longer impelled by their inherent nature – a living essence – everything is instead directed always and absolutely by blind external forces.

At a stroke the universe was reduced to base mechanics; the deepest intricacies of the stars and the planets (once gods) entirely akin to elaborate mechanisms. At a stroke, it is fair to say not only that Galileo had levelled all stuff, but in the process he effectively killed the cosmos; all stuff being compelled to obey the same laws because all stuff is inherently inert.

Now if Newton’s apple is a reworking of the Fall of Man as humanity’s redemption through scientific progress, then the best-known fable of Galileo (the tale itself again wholly apocryphal) is how he once instructed an assistant to drop cannon balls of differing sizes from the Leaning Tower of Pisa in order to test how objects fell to earth, observing that they landed together on the grass below.

In fact, this experiment was recreated by Apollo astronauts up on the moon’s surface where, without the hindrance of any atmosphere, it was indeed observed that objects as remarkably different as a hammer and a feather truly accelerate at the same rate, landing in the dust at precisely the same instant. This same experiment is one I have also repeated in class, stood on a desk and surrounded by bemused students who, unfamiliar with the principle, are reliably astonished; since intuitively we all believe that heavier weights must fall faster.
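There is no mystery once the result is put into the Newtonian terms that came later: the gravitational force on a body is proportional to its mass, so the mass cancels out of the acceleration entirely:

\[
a \;=\; \frac{F}{m} \;=\; \frac{m\,g}{m} \;=\; g
\]

the same for hammer, feather and cannon ball alike – provided nothing like air resistance intervenes.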

But digressions aside, the important point is this: Galileo’s fabled experiment invokes a different Biblical reference. It is in fact a parable of sorts, reminding us all not to jump to unscientific conclusions and instead always “to do the maths”. And in common with Newton’s apple it recalls a myth from Genesis; in this case the Tower of Babel story, an architectural endeavour supposedly conceived at a time when the people of the world had been united and wished to build a short-cut to heaven. Afterwards, God decided to punish us all (as he likes to do) with a divide-and-conquer strategy; our divided nations additionally confused by the introduction of a multiplicity of languages. But then along came Galileo to unite us once more with his own gift, the universal application of a universal language called mathematics. For as he wrote:

Philosophy is written in this grand book, which stands continually open before our eyes (I say the ‘Universe’), but cannot be understood without first learning to comprehend the language and know the characters as it is written. It is written in mathematical language, and its characters are triangles, circles and other geometric figures, without which it is impossible to humanly understand a word; without these one is wandering in a dark labyrinth. 13

*

Thomas Hobbes was very well studied in the works of Galileo, and on his travels around Europe in the mid 1630s he may very well have visited the great man in Florence. 14 In any case, Hobbes fully adopts Galileo’s mechanistic conception of the universe and draws what he sees as its logical conclusion, extrapolating from what is true of external nature and determining that this must also be true of human nature – a step Galileo never ventured.

All human actions, Hobbes posits, whether voluntary or involuntary, are the direct outcomes of physical bodily processes occurring inside our organs and muscles. 15 Of the precise mechanisms, he ascribes the origins to “insensible” actions that he calls “endeavours”; something he leaves for physiologists to study and comprehend. 16

Fleshing out this bio-mechanical model, Hobbes next explains how all human motivations – which he calls ‘passions’ – must necessarily function likewise on the basis of these material processes, and are thereby reducible to forces of attraction and repulsion; in his own terms, ‘appetites’ and ‘aversions’. 17

In the manner of elaborate machines, Hobbes says, humans operate in accordance with responses that entail either the automatic avoidance of pain or the increase of pleasure; the manifestation of apparent ‘will’ being nothing more than the overarching ‘passion’ that prevails over all these lesser ‘appetites’. Concerned solely with improving his lot, Man, he concludes, is inherently ‘selfish’.

Having presented his strikingly modern conception of life as a whole and human nature more particularly, Hobbes next considers what he calls “the natural condition of mankind” (or ‘state of nature’) and this in turn leads him to consider why “there is always war of everyone against everyone”:

Whatsoever therefore is consequent to a time of War, where every man is Enemy to every man; the same is consequent to the time, wherein men live without other security, than what their own strength, and their own invention shall furnish them withall. In such condition, there is no place for Industry; because the fruit thereof is uncertain; and consequently no Culture of the Earth; no Navigation, nor use of the commodities that may be imported by Sea; no commodious Building; no Instruments of moving, and removing such things as require much force; no Knowledge of the face of the Earth; no account of Time; no Arts; no Letters; no Society; and which is worst of all, continual fear, and danger of violent death; And the life of man, solitary, poor, nasty, brutish, and short. 18

According to Hobbes, this ‘state of nature’ becomes inevitable whenever our laws and social conventions cease to function and no longer protect us from our otherwise fundamentally rapacious selves. Once civilisation gives way to anarchy, then anarchy, according to Hobbes, is inevitable hell, because our automatic drive to improve our own situation comes into immediate conflict with that of every other individual. To validate this claim, Hobbes then reminds us of the fastidious countermeasures everyone takes to defend against their fellows:

It may seem strange to some man, that has not well weighed these things; that Nature should thus dissociate, and render men apt to invade, and destroy one another: and he may therefore, not trusting to this Inference, made from the Passions, desire perhaps to have the same confirmed by Experience. Let him therefore consider with himself, when taking a journey, he arms himself, and seeks to go well accompanied; when going to sleep, he locks his doors; when even in his house he locks his chests; and this when he knows there be Laws, and public Officers, armed, to revenge all injuries shall be done him; what opinion he has of his fellow subjects, when he rides armed; of his fellow Citizens, when he locks his doors; and of his children, and servants, when he locks his chests. Does he not there as much accuse mankind by his actions, as I do by my words? 19

Hobbes is not making any moral judgment here, since he regards all nature, drawing no special distinctions for human nature, as equally compelled by these self-same ‘passions’ and so in his conceived ongoing war of all on all, objectively the world he sees is value neutral. As he continues:

But neither of us accuse mans nature in it. The Desires, and other Passions of man, are in themselves no Sin. No more are the Actions, that proceed from those Passions, till they know a Law that forbids them; which till Laws be made they cannot know: nor can any Law be made, till they have agreed upon the Person that shall make it. 20

We might conclude indeed that all’s fair in love and war because fairness isn’t the point, at least according to Hobbes. What matters here are the consequences of actions, and so Hobbes’ stance is surprisingly modern.

Nevertheless, Hobbes wishes to ameliorate the flaws he perceives in human nature, in particular those born of selfishness, by constraining behaviour to accord with what he deduces to be ‘laws of nature’: precepts and general rules found out by reason. This, says Hobbes, is the only way to overcome man’s otherwise sorry state of existence, in which a perpetual war of all against all ensures everyone’s life is “nasty, brutish and short”. Thus to save us from a dreadful ‘state of nature’ he demands conformity to more reasoned ‘laws of nature’ – in spite of the seeming contradiction!

In short, not only does Hobbes’ prognosis speak to the urgency of securing a social contract, but his whole thesis heralds our bio-mechanical conception of life and of the evolution of life. Indeed, following the tremendous successes of the physical sciences, Hobbes’ radical faith in materialism, which must have been extremely shocking to his contemporaries, has gradually come to seem commonsensical; so much so that its overlooked presumptions led philosopher Karl Popper to coin the phrase “promissory materialism”: adherents to the physicalist view casually dismissing concerns about gaps in understanding as problems to be worked out in future – just as Hobbes does, of course, when he delegates the task of comprehending all human actions and ‘endeavours’ to the physiologists.

*

But is it really the case, as Hobbes concludes, that individuals can be restrained from barbarism only by laws and social contracts? If so, then we might immediately wonder why acts of indiscriminate murder and rape are comparatively rare crimes, given that these are amongst the toughest crimes of all to foil or to solve. By contrast, most people, most of the time, appear to prefer not to commit everyday atrocities, and it would be odd to suppose that they refrain purely because they fear arrest and punishment. Everyday experience tells us instead that most people don’t really have much inclination for committing violence or other acts of grievous criminal intent.

Moreover, if we look for supporting evidence of Hobbes’ conjecture, we actually find an abundance that refutes him. We know for instance that the appalling loss of life during the last world war would have been far greater still were it not for a very deliberate lack of aim amongst the combatants. A lack of zeal for killing even during the heat of battle turns out to be the norm, as US General S. L. A. Marshall learned from firsthand accounts gathered at the end of the war, when he debriefed thousands of returning GIs in efforts to learn more about their combat experiences. 21 What he heard was almost too incredible: not only had three-quarters of combatants never actually fired at the enemy – not even when coming under direct fire themselves – but amongst those who did shoot, only a tiny two percent had aimed their weapons to kill the enemy.

Nor is this lack of bloodlust a modern phenomenon. At the end of the Battle of Gettysburg during the American Civil War, the Union Army collected up tens of thousands of weapons and discovered that the vast majority were still fully loaded. Indeed, more than half of the rifles had multiple loads – one had an incredible 23 loads packed all the way up the barrel. 22 Many of these soldiers had never actually pulled the trigger; the majority quite literally preferring to feign combat rather than fire off shots.

It transpires that contrary to the depictions of battles in Hollywood movies, by far the majority of servicemen take no pleasure at all in killing one another. Modern military training from Vietnam onwards has even developed methods to compensate for the ordinary lack of ruthlessness: heads are shaven, identities stripped, and conscripts are otherwise desensitised, turning men into better machines for war.

But then, if there is one day in history more glorious than any other surely it has to be the Christmas Armistice of 1914. The war-weary and muddied troops huddling for warmth in no-man’s land, sharing food, singing carols together, before playing the most beautiful games of football ever played: such outpourings of sanity in the face of lunacy that no movie screenplay could reinvent. Indeed, it takes artistic genius even to render such scenes of universal comradeship and brotherhood as anything other than sentimental and clichéd, and yet they happened nonetheless.

*

In his autobiography Hobbes relates that his mother’s shock on hearing the news of the approaching Spanish Armada had induced his premature birth, famously saying: “my mother gave birth to twins: myself and fear.” Doing his utmost to avoid getting caught up in the tribulations of the English Civil War, Hobbes lived through exceptionally fearful times, and doubtless this accounts for why his political theory reads like a reaction and an intellectual response to fear. But fear produces monsters and Hobbes’ solution to societal crisis involves an inbuilt tolerance for tyranny. In fact Hobbes understood perfectly well that the power to protect is derived from the power to terrify; indeed to kill.

In response, Hobbes manages to conceive of a system of government whose authority is sanctioned – indeed sanctified – through terrifying its subjects into consenting to their own subjugation. On this same Hobbesian basis, if a highwayman demands “your money or your life?” then by agreeing you have likewise entered into a contract! In short, this is government by way of protection racket; Hobbes’ keenness for an overarching, unassailable but (hopefully) benign dictatorship is perhaps best captured by the absolute power he grants the State, right down to the foundational level of determining morality as such:

I observe the Diseases of a Common-wealth, that proceed from the poison of seditious doctrines; whereof one is, “That every private man is Judge of Good and Evil actions.” This is true in the condition of mere Nature, where there are no Civil Laws; and also under Civil Government, in such cases as are not determined by the Law. But otherwise, it is manifest, that the measure of Good and Evil actions, is the Civil Law… 23

Keeping in mind that for Hobbes every action proceeds from a mechanistic cause, it follows that the very concept of ‘freedom’ actually struck him as a logical fallacy. Indeed, as someone who professed to be able to square the circle 24 – which led to a notoriously bitter mathematical dispute with Oxford professor John Wallis – Hobbes’ explicit dismissal of ‘freedom’ is suitably fitting:

[W]ords whereby we conceive nothing but the sound, are those we call Absurd, insignificant, and Non-sense. And therefore if a man should talk to me of a Round Quadrangle; or Accidents Of Bread In Cheese; or Immaterial Substances; or of A Free Subject; A Free Will; or any Free, but free from being hindred by opposition, I should not say he were in an Error; but that his words were without meaning; that is to say, Absurd. 25

According to Hobbes then, freedom reduces to absurdity – or to ‘a round quadrangle’! – a perspective that understandably opens the way for totalitarian rule: and perhaps no other thinker was ever so willing as Hobbes to trade freedom for the sake of security. But finally, Hobbes is mistaken, as a famous experiment carried out originally by psychologist Stanley Milgram – and since repeated many times – amply illustrates.

*

For those unfamiliar with Milgram’s experiment, here is the set up:

Volunteers are invited to what they are told is a scientific trial investigating the effects of punishment on learning. Having been separated into groups, they are then assigned roles either as teachers or as learners. At this point, the learner is strapped into a chair and fitted with electrodes, before the teacher, in an adjacent room, is given control of apparatus that enables him or her to deliver electric shocks. In advance of this, the teachers are given a low voltage sample shock, just to give them a taste of the punishment they are about to inflict.

The experiment then proceeds with the teacher administering electric shocks of increasing voltage which he or she must incrementally adjust to punish wrong answers. As the scale on the generator approaches 400V, a marker reads “Danger Severe Shock” and beneath the final switches there is simply XXX. Proceeding beyond this level evidently runs the risk of delivering a fatal shock, but in the experiment participants are encouraged to proceed nonetheless.

How, you may reasonably wonder, could such an experiment have been ethically sanctioned? Well, it’s a deception. All of the learners are actors, and their increasingly desperate pleading is as scripted as their ultimate screams. Importantly, however, the true participants (who are all assigned as ‘teachers’) are led to believe the experiment and the shocks are for real.

The results – repeatable ones, as I say – are certainly alarming: two-thirds of the subjects will go on to deliver what they are told are potentially fatal shocks. In fact, the experiment is continued until a teacher has administered three shocks at the 450V level, by which time the actor playing the learner has stopped screaming and must therefore be presumed either unconscious or dead.

“The chief finding of the study and the fact most urgently demanding explanation”, Milgram wrote later, is that:

Ordinary people, simply doing their jobs, and without any particular hostility on their part, can become agents in a terrible destructive process. Moreover, even when the destructive effects of their work become patently clear, and they are asked to carry out actions incompatible with fundamental standards of morality, relatively few people have the resources needed to resist authority. 26

Milgram’s experiment has occasionally been misrepresented as some kind of proof of our innate human capacity for cruelty and for doing evil. But this was neither the object of the study nor the conclusion Milgram drew. The evidence instead led him to conclude that the vast majority take no pleasure in inflicting suffering, but that surprising numbers will carry on nevertheless when they have been placed under a certain kind of duress, and especially when an authority figure is instructing them to do so:

Many of the people were in some sense against what they did to the learner, and many protested even while they obeyed. Some were totally convinced of the wrongness of their actions but could not bring themselves to make an open break with authority. They often derived satisfaction from their thoughts and felt that – within themselves, at least – they had been on the side of the angels. They tried to reduce strain by obeying the experimenter but “only slightly,” encouraging the learner, touching the generator switches gingerly. When interviewed, such a subject would stress that he “asserted my humanity” by administering the briefest shock possible. Handling the conflict in this manner was easier than defiance. 27

Milgram thought that it is this observed tendency for compliance amongst ordinary people that had enabled the Nazis to carry out their crimes and that led to the Holocaust. But his study might also account for why those WWI soldiers, even after sharing food and songs with the enemy, returned ready to fight on in the hours, days, weeks and years that followed the Christmas Armistice. While disobedience was severely punished, often with the ignominy of court martial and the terror of a firing squad, it is likely that authority alone would be persuasive enough to ensure compliance for many of those stuck in the trenches. Most people will follow orders no matter how horrific the consequences – this is Milgram’s abiding message.

In short, what Milgram’s study shows is that Hobbes’ solution is, at best, deeply misguided, because it is authoritarianism (his proposed remedy) that mostly leads ordinary humans to commit the worst atrocities. So Milgram offers us a way of considering Hobbes from a top-down perspective: addressing the issue of how obedience to authority influences human behaviour.

But what about the bottom-up view? After all, this was Hobbes’ favoured approach, since he very firmly believed (albeit incorrectly) that his own philosophy was solidly underpinned by pure mathematics – his grandest ambition had been to construct an entire philosophy that follows logically and directly from the theorems of Euclid. Thus, according to Hobbes’ ‘promissory materialism’, which sees Nature as wholly mechanistic and reduces actions to impulse, all animal behaviours – including human ones – are fully accounted for and ultimately determined by, to apply a modern phrase, ‘basic instincts’. But again, is this actually true? What does biology have to say on the matter, and most specifically, what are the findings of those who most closely study real animal behaviour?

*

This chapter is concerned with words rather than birds…

So writes pioneering British ornithologist David Lack, who devoted much of his life to the study of bird behaviour, conducting field work for four years while he also taught at Dartington Hall School in Devon; his spare time spent observing populations of local robins, his findings delightfully written up in a seminal work titled straightforwardly The Life of the Robin. The passage I am about to quote follows on from the start of chapter fifteen, in which he presents a thoughtful aside under the heading “A digression upon instinct”. It goes on:

A friend asked me how swallows found their way to Africa, to which I answered, ‘Oh, by instinct,’ and he departed satisfied. Yet the most that my statement could mean was that the direction finding of migratory birds is part of the inherited make-up of the species and is not the result of intelligence. It says nothing about the direction-finding process, which remains a mystery. But man, being always uneasy in the presence of the unknown, has to explain it, so when scientists abolish the gods of the earth, of lightning, and of love, they create instead gravity, electricity and instinct. Deification is replaced by reification, which is only a little less dangerous and far less picturesque.

Frustrated by the types of misunderstanding generated and perpetuated by misuse of the term ‘instinct’, Lack then ventures at length into the variety of ambiguities and mistakes that accompany it, both in casual conversation and in academic contexts; considerations that lead him to a striking conclusion:

The term instinct should be abandoned… Bird behaviour can be described and analysed without reference to instinct, and not only is the word unnecessary, but it is dangerous because it is confusing and misleading. Animal psychology is filled with terms which, like instinct, are meaningless, because so many different meanings have been attached to them, or because they refer to unobservables or because, starting as analogies, they have grown into entities. 28

When I first read Lack’s book I quickly fell under the spell of his lucid and nimble prose and marvelled at how infectious his love for his subject was. As ordinary as they may seem to us, robins live surprisingly complicated lives, and all of this was richly told, but what stood out most was Lack’s view on instinct: if its pervasive stink throws us off the scent in our attempts to study bird behaviour, then how much more alert must we be to its bearing on perceived truths about human psychology? Lack ends his own brief digression with a germane quote from philosopher Francis Bacon that neatly considers both:

“It is strange how men, like owls, see sharply in the darkness of their own notions, but in the daylight of experience wink and are blinded.” 29

*

The wolves of childhood were creatures of nightmares. One tale told of a big, bad wolf blowing your house down to eat you! Another reported a wolf sneakily dressing up as an elderly relative and climbing into bed. Just close enough to eat you! Still less fortunate was the poor duck in Prokofiev’s enchanting children’s suite Peter and the Wolf, swallowed alive and heard in a climactic diminuendo quacking from inside the wolf’s belly. When I’d grown a little older, I also came to hear stories of werewolves that sent still icier dread coursing down my spine…

I could go on and on with similar examples, because wolves are invariably portrayed as rapacious and villainous throughout folkloric traditions across the civilised world of Eurasia, which is actually quite curious when you stop to think about it. Curious because wolves are not especially threatening to humans and wolf attacks are comparatively rare occurrences – while other large animals, including bears, all of the big cats, sharks, crocodiles, and even large herbivores like elephants and hippos, pose a far greater threat to us. To draw an obvious comparison, polar bears habitually stalk humans, and yet rather than seeing them as terrifying, we are taught to see them as cuddly. Evidently, our attitudes towards the wolf have been shaped by factors other than the observed behaviour of wolves themselves.

So now let us consider the rather extraordinary relationship our species actually has with another large carnivore: man’s best friend and cousin of the wolf, the dog – and incidentally, dogs kill (and likely have always killed) a lot more people than wolves.

The close association between humans and dogs is incredibly ancient. Dogs are very possibly the first animal humans ever domesticated, becoming so ubiquitous that no society on earth exists that hasn’t adopted them. This adoption took place so long ago in prehistory that conceivably it may have played a direct role in the evolutionary development of our species; and since frankly we will never know the answers here, I feel free to speculate a little. So here is my own brief tale about the wolf…

One night a tribe was sat around the campsite finishing off the last of their meal as a hungry wolf secretly looked on. A lone wolf, and being a lone wolf, she was barely able to survive. Enduring hardship and eking out a precarious existence, she was also longing for company. Drawn to the smell of the food and the warmth of the fire, the wolf tentatively entered the encampment and for once wasn’t beaten back with sticks or chased away. Instead one of the elders at the gathering tossed her a bone to chew on. The next night the wolf returned, and the next, and the next, until soon she was welcomed permanently as one of the tribe: the wolf at the door finding a new home as the wolf by the hearth.

As a story, it sounds plausible enough that something like it may have happened countless times and in many locations. Having enjoyed the company of the wolf, the people of the tribe later adopted her cubs (or perhaps it all began with cubs). In any case, as the wolves became domesticated they changed, and within just a few generations of selective breeding had been fully transformed into dogs.

The rest of the story is more or less obvious too. With dogs, our ancestors enjoyed better protection and could hunt more efficiently. Dogs run faster, have far greater endurance, keener hearing and smell. Soon they became our fetchers and carriers too; our dogsbodies. Speculating a little further, our symbiotic relationship might also have opened up the possibility for evolutionary development at a physiological level. Like cave creatures that lose pigmentation and in which eyesight atrophies in favour of greater tactile sense or sonar 30, we likewise might have lost acuity in those senses we needed less, as the dogs compensated for our loss, which might then have freed our brains for other tasks. Did losses in our faculties of smell and hearing enable more advanced dexterity and language skills? Did we perhaps also lose our own snarls to replace them with smiles?

I shan’t say much more about wolves, except that we know from our close bond with dogs that they are affectionate and loyal creatures. So why did we vilify them as the “big, bad wolf”? My hunch is that they represent, symbolically, something we have lost, or perhaps more pertinently, something we have repressed in the process of our own domestication. In a deeper sense, this psychological severance involved our alienation from all of nature. It has caused us to believe, like Hobbes, that all of nature is nothing but rapacious appetite, red in tooth and claw, and that morality must therefore be imposed upon it by something other; that other being human rationality.

Our scientific understanding of wolf behaviour has been radically overturned. Previously accepted beliefs that wolves compete for dominance by becoming alpha males or females turn out to be largely untrue. Or at least this happens only if unrelated wolves are kept in captivity. In all cases where wolves are studied in their natural environment, the so-called ‘alpha’ wolves are just the parents – in other words, wolves form families just like we do.

*

One school views morality as a cultural innovation achieved by our species alone. This school does not see moral tendencies as part and parcel of human nature. Our ancestors, it claims, became moral by choice. The second school, in contrast, views morality as growing out of the social instincts that we share with many other animals. In this view, morality is neither unique to us nor a conscious decision taken at a specific point in time: it is the product of gradual social evolution. The first standpoint assumes that deep down we are not truly moral. It views morality as a cultural overlay, a thin veneer hiding an otherwise selfish and brutish nature. Perfectibility is what we should strive for. Until recently, this was the dominant view within evolutionary biology as well as among science writers popularizing this field. 31

These are the words of Dutch primatologist Frans de Waal, who became one of the world’s leading experts in chimpanzee behaviour. Based on his studies, de Waal applied the term “Machiavellian intelligence” to describe the variety of cunning and deceptive social strategies used by chimps. A few years later, however, de Waal encountered bonobos – the pygmy cousins both of chimps and of ourselves – held captive in a zoo in Holland, and he says they had an immediate effect on him:

“[T]hey’re totally different. The sense you get looking them in the eyes is that they’re more sensitive, more sensual, not necessarily more intelligent, but there’s a high emotional awareness, so to speak, of each other and also of people who look at them.” 32

Sharing a common ancestor with bonobos and chimps, humans are in fact equally closely related to both species, and interestingly, when de Waal was asked whether he thinks we are more like the bonobo or the chimp, he replied:

“I would say there are people in this world who like hierarchies, they like to keep people in their place, they like law enforcement, and they probably have a lot in common, let’s say, with the chimpanzee. And then you have other people in this world who root for the underdog, they give to the poor, they feel the need to be good, and they maybe have more of this kinder bonobo side to them. Our societies are constructed around the interface between those two, so we need both actually.” 33

De Waal and others who have studied primates are often astonished by their kinship with our own species. When we look deep into the eyes of chimps, gorillas, or even those of our dogs, we find ourselves reflected in every way. It is not hard to fathom where morality came from, and the ‘veneer theory’ of Hobbes reeks of a certain kind of religiosity, infused with a deep insecurity born of the hardship and terrors of civil strife.

*

New scientific studies are showing that primates, elephants, and other mammals including dogs also display empathy, cooperation, fairness and reciprocity – that morality is an aspect of nature. Here Frans de Waal shares some surprising videos of behavioural tests that demonstrate how many of these moral traits all of us share:

*

II       Between two worlds

I was of three minds,

Like a tree

In which there are three blackbirds

— Wallace Stevens 35

*

Of all the creatures on earth, apart from a few curiosities like the kangaroo and giant pangolin, or some species of long-since extinct dinosaurs, only the birds share our bipedality. The adaptive advantage of flight is so self-evident that there’s no need to ponder why the forelimbs of birds morphed into wings, but the case for humans is more curious. Why it was that about four million years ago a branch of hominids came to stand on two legs rather than four, enabling them to move quite differently from our closest living relatives (bonobos and chimps), with all of the physiological modifications this involved, still remains a mystery. But what is abundantly clear and beyond all speculation is that this single evolutionary change freed up our hands for purposes no longer restricted by their formative locomotive demands. Having liberated our hands, not only did we become supreme manipulators of tools, but this sparked a parallel growth in intelligence, causing us to become supreme manipulators per se – the very word deriving from the Latin manus, meaning ‘hand’, of course.

With our evolution as manual apes, humans also became constructors, and curiously here is another trait that we have in common with many species of birds. That birds are able to build elaborate structures to live in is indeed a remarkable fact, and that they necessarily achieve this by organising and arranging the materials using only their beaks is surely more remarkable again. Storks with their ungainly bills somehow manage to arrange large piles of twigs so carefully that their nests often overhang impossibly small platforms like the tips of telegraph poles. House martins construct wonderfully symmetrical domes just by patiently gluing together globules of mud. Weaver birds, a range of species similar to finches, build the most elaborate nests of all, and quite literally weave their homes from blades of grass. How they acquired this ability remains another mystery, for though recent studies have found that there is a degree of learning involved in the styles and manner of construction, this general ability of birds to construct nests is an innate one. According to that throwaway term, they do it ‘by instinct’. By contrast, in one way or another, all human builders must be trained. As with so much about us, all our constructions are therefore cultural artefacts.

*

With very few exceptions, owls have yellow eyes. Cormorants instead have green eyes. Moorhens and coots have red eyes. The otherwise unspectacular satin bowerbird has violet eyes. Jackdaws sometimes have blue eyes. Blackbirds have extremely dark eyes – darker even than their feathers – jet black pearls set within a slim orange annulus which neatly matches their strikingly orange beaks. While eye colour is consistent within each bird species, the case is clearly different amongst humans, where eye colour is one of a multitude of variable physical characteristics, including natural hair and skin colour, facial characteristics, and height. Nonetheless, as with birds and other animals where there is significant uniformity, most of these colourings and other identifying features are physical expressions of the individual’s genetic make-up or genotype; an outward expression of genetic inheritance known technically as the phenotype.

Interestingly, for a wide diversity of species, there is an inheritance not only of morphology and physiology but also of behaviour. Some of these behavioural traits may then act in turn to shape the creature’s immediate environment – so the full phenotypic expression is often observed to operate outside and far beyond the body of the creature. These ‘extended phenotypes’, as Dawkins calls them, are discovered within such wondrous but everyday structures as spiders’ webs, the delicate tube-like homes formed by caddis fly larvae, the larger-scale constructions of beavers’ dams and of course birds’ nests. It is reasonable therefore to speculate on whether the same evolutionary principle applies to our human world.

What, for instance, of our own houses, cars, roads, bridges, dams, fortresses, cathedrals, systems of knowledge, economies, music and other works of art, languages…? Once we have correctly located our species as just one amongst many, existing at a different tip of an otherwise unremarkable branch of our undifferentiated evolutionary tree of life, why wouldn’t we judge our own designs as similarly latent expressions of human genes interacting with their environment? Indeed, Dawkins addresses this point directly and points out that, tempting as it may be, such broadening of the concept of phenotype stretches his ideas too far, since, to offer his own example, scientific justification must then be sought for genetic differences between the architects of different styles of buildings! 36

In fact, the distinction here is clear: artefacts of human conception, which can be as wildly diverse as Japanese Noh theatre, Neil Armstrong’s footprints on the moon, Dadaist poetry, recipes for Christmas pudding, TV footage of Geoff Hurst scoring a World Cup hat-trick, as mundane as flush toilets, or as rarefied as Einstein’s thought experiments, are all categorically different from such animal artefacts as spiders’ webs and beavers’ dams. They are patterns of culture not nature. Likewise, all human behaviour right down to the most ephemeral, including gestures, articulations and tics, is profoundly patterned by culture and not shaped solely by pre-existing and underlying patterns within our human genotypes.

Vocabulary – another human artefact – makes this plain. We all know that eggs are ‘natural’ whereas Easter eggs are distinguishable as ‘artificial’, and that the eye is ‘natural’ while cameras are ‘technological’, both of our antonyms having roots in words for ‘art’. And while ‘nature’ is a strangely slippery noun that in English points to a whole host of interrelated objects and ideas, equivalent words are nonetheless found throughout other languages to distinguish our manufactured worlds – of arts and artifice – from the surrounding physical world of animals, plants and landscapes. This same word-concept is reinvented again and again for a simple yet important reason: the difference it labels is inescapable.

*

As a species, we are incorrigibly anthropomorphising; constantly imbuing the world with our own attributes and mores. Which brings up a related point: what animal besides the human is capable of reimagining things in order to make them conform to any preconceived notion of any kind? Dogs may mistake us for other dogs – although I doubt this – but still we are their partners within surrogate packs, and thus, in a sense, surrogate dogs. But from what I know of dogs, their world is altogether more direct. Put simply it is… stick chasing… crap taking… sleep sleeping… or (best of all) going for a walk, which again is more straightforwardly a matter of being present on an outdoor exploration! In short, dogs live close to the passing moment, because they have nowhere else to live. Yet humans mostly cannot. Instead we drift in and out of our past or in anticipation of our future. Recollections and goals fill our thoughts repeatedly, and it is exceedingly difficult to attend fully to the present.

Moreover, for us the world is nothing much without other humans. Without culture, any world worthy of the name is barely conceivable at all, since humans are primarily creatures of culture. Yes, there would still be the wondrous works of nature, but no art beyond, and no music except for the occasional bird-song and the wind in the trees: nothing but nothing beyond the things-in-themselves that surround us, and without other humans, no need to communicate our feelings about any of this. In fact, there could be no means to communicate at all, since no language could ever form in such isolation. Instead, we would float through a wordless existence, which might be blissful or grindingly dull, but either way our sense impressions and emotions would remain unnamed.

So it is extremely hard to imagine any kind of world without words, although such a world quite certainly exists. It exists for animals and it exists in exceptional circumstances for humans too. The abandoned children who have been nurtured by wild animals (very often wolves) provide an uneasy insight into this world beyond words. So too, for different reasons, do a few of the profoundly and congenitally deaf. On very rare occasions, these children have gone on to learn how to communicate, and when this happens, what they tell us is just how important language is.

*

In his book Seeing Voices, neurologist Oliver Sacks describes the awakening of a number of remarkable individuals. One such was Jean Massieu. Almost without language until the age of fourteen, Massieu had become a pupil at Roch-Ambroise Cucurron Sicard’s pioneering school for the deaf. Astonishingly, he went on to become eloquent in both sign language and written French.

Based on Sicard’s original account, Sacks examines Massieu’s steep learning curve, and sees close similarities to his own experience with a deaf child. By attaching names to objects in the pictures Massieu would draw, Sicard was able to open the young man’s eyes. Labels that to begin with left his pupil “utterly mystified” were abruptly understood once Massieu had “got it”. And here Sacks emphasises how Massieu understood not just an abstract connection between the pencil lines of his own drawing and the seemingly incongruous additional strokes of his tutor’s labels, but, almost instantaneously, he also recognised the value of such a tool: “… from that moment on, the drawing was banished, we replaced it with writing.”

The most magical part of Sacks’ retelling comes in the description of Massieu and Sicard’s walks together through the woods. “He didn’t have enough tablets and pencils for all the names with which I filled his dictionary, and his soul seemed to expand and grow with these innumerable denominations…” Sicard later wrote.

Massieu’s epiphany brings to mind the story of Adam, who was set the task of naming all the animals in Eden, and Sacks tells us:

“With the acquisition of names, of words for everything, Sicard felt, there was a radical change in Massieu’s relation to the world – he had become like Adam: ‘This newcomer to earth was a stranger on his own estates, which were restored to him as he learned their names.’” 37

This gift for language, quite obviously, is what most sets us apart from other creatures. Not that humans invented language from scratch, of course, since it grew up both with us and within us: one part phenotype and one part culture. It evolved within other species too, but for reasons that remain unclear, we excelled, and as a consequence became adapted to live in two worlds, or as Aldous Huxley preferred to put it: we have become “amphibian”, in that we simultaneously occupy “the given and the home-made, the world of matter, life and consciousness and the world of symbols.” 38

Words and symbols enable us to relate the present to the past: we reconstruct it, or perhaps reinvent it. Likewise, with language we can envisage a future. This moves us outside Time. So it helps us to heal past wounds and to prepare for future events. Indeed, it anchors the world and our place within it; but correspondingly, it also detaches us from the immediate present.

For whereas many living organisms exist entirely within their immediate physical reality, human beings occupy a parallel ideational space where we are almost wholly embedded in language. Now think about that for a moment… no really do!

Stop reading this.

Completely ignore this page of letters, and silence your mind.

Okay, close your eyes and turn your attention to absolutely anything you like and then continue reading…

So here’s my question: when you were engaged in your thoughts, whatever you thought about, did you use words at all? Very likely you literally “heard” them: your inner voice filling the silence in its busy, if generally unobtrusive and familiar way. Pause again and now contemplate the everyday noise of being oneself.

Notice how exceedingly difficult it is to exist if only for a moment without any recourse to language.

Perhaps what Descartes really meant to say was: I am therefore I think!

For as the ‘monkey mind’ goes wandering off, the words instantly creep back into our minds, and with our words comes this detachment from the present. Every spiritual teacher knows this, of course, recognising that we cannot be wholly present to the here and now while our mind darts off to visit memories, wishes, opinions, descriptions, concepts and plans: the same memories, wishes, opinions, descriptions, concepts and plans that gave us an evolutionary advantage over our fellow creatures. The sage also understands how the true art of meditation cannot involve any direct effort to silence our excitable thoughts, but only to ignore them. Negation of thought is not thinking no thought; it is not thinking at all: no words!

It is evident, therefore, how in this essential way we are indeed oddly akin to amphibious beings, since we occupy and move between two distinct habitats. Put differently, our sensuous, tangible outside world of thinginess (philosophers sometimes call this ‘sense data’) is totally immersed within the inner realms of language and symbolism. So when we see a blob with eight thin appendages we very likely observe something spider-like. If we hate spiders then we are very likely to recoil from it. If we have a stronger aversion then we will recoil even after we are completely sure that it’s just a picture of a spider or, in extreme cases, a tomato stalk. On such occasions, our feelings of fear or disgust arise not as the result of failing to distinguish the likeness of a spider from a real spider, but from the power of our own imagination: we literally jump at the thought of a spider.

Moreover, words are sticky. They coagulate together in streams of association and these mould our future ideas. Religion = goodness. Religion = stupidity. If we hold the first opinion then crosses and pictures of saints will automatically generate a different affect than if we hold the latter. Or how about replacing the word ‘religion’ with, say, ‘patriotism’: obviously our perception of the world then alters in a different way. In fact, just as pheromones in the animal kingdom cause the direct transmission of behavioural effects between members of a species, the language secreted by humans is likewise capable of directly impacting the behaviour of others.

It has become our modern tendency to suppose automatically that the arrow which connects these strikingly different domains points unerringly in one direction: that language primarily describes the world, whereas the world as such is relatively unmoved by our descriptions of it. This is basically the presumed scientific arrangement. By contrast, any kind of magical reinterpretation of reality involves a deliberate reversal of the direction of the arrow, such that symbols and language are treated as potent agents that might actively cause change within the material realm. Scientific opinion holds that this is false, and yet, on a deeply personal level, language and symbolism not only suffuse the living world, but do quite literally shape and transform it. As Aldous Huxley writes:

“Without language we should merely be hairless chimpanzees. Indeed, we should be something much worse. Possessed of a high IQ but no language, we should be like the Yahoos of Gulliver’s Travels—creatures too clever to be guided by instinct, too self-centred to live in a state of animal grace, and therefore condemned to remain forever, frustrated and malignant, between contented apehood and aspiring humanity. It was language that made possible the accumulation of knowledge and the broadcasting of information. It was language that permitted the expression of religious insight, the formulation of ethical ideals, the codification of laws. It was language, in a word, that turned us into human beings and gave birth to civilization.” 39

*

As I look outside my window I see a blackbird sitting on the TV aerial of a neighbouring rooftop. This is what I see, but what does the blackbird see? Obviously I cannot know for certain, though merely in terms of what he senses, we know that his world is remarkably different from ours. For one thing, birds have four types of cone cells in the retinas of their eyes while we have only three. Our cone cells collect photons centred on red, green and blue frequencies, and different combinations generate a range of colours that can be graphically mapped as a continuously varying two-dimensional plane of colours. If we add another colour receptor, however, then the same mapping requires an additional axis extending above the plane. For this reason we might justifiably say that a bird sees colours in ways that differ not merely by virtue of the extent of the detectable range of frequencies, but that a bird’s vision involves a range of colour combinations of a literally higher dimension.
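To make the dimensional point concrete, here is a minimal sketch in Python – my own toy illustration with invented response values, not anything drawn from the science of vision itself. Once overall brightness is factored out, N cone types leave N−1 independent colour coordinates: two for the human trichromat, three for the bird tetrachromat.

# A toy model (purely illustrative, with assumed values): normalise raw
# cone responses so that they sum to one. Because the normalised values
# always sum to 1, they are not independent: N cone types yield an
# (N-1)-dimensional space of distinguishable colours.
def chromaticity(cone_responses):
    total = sum(cone_responses)
    if total == 0:
        raise ValueError("no light detected")
    return tuple(r / total for r in cone_responses)

# Human (trichromat): long-, medium- and short-wavelength cones -> a 2D plane
human = chromaticity((0.7, 0.2, 0.1))

# Bird (tetrachromat): an extra ultraviolet-sensitive cone -> a 3D volume
bird = chromaticity((0.5, 0.2, 0.1, 0.2))

print(len(human) - 1, "independent colour axes for the human")  # prints 2
print(len(bird) - 1, "independent colour axes for the bird")    # prints 3

In other words, the extra receptor does not merely widen the detectable range of frequencies; it adds a whole new axis along which colours can vary.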

Beyond these immediate differences in sense data, there is another way in which a bird’s perceptions – or more strictly speaking its apperceptions – are utterly different from our own, for though the blackbird evidently sees the aerial, it does not recognise it as such. Presumably it sees nothing beyond a convenient metal branch to perch upon, decked with unusually regular twigs. For even the most intelligent of all blackbirds is incapable of knowing more, since this is all any bird can ever understand about the aerial.

No species besides our own is capable of discovering why the aerial was actually put there, or how it is connected to an elaborate apparatus that turns the invisible signals it captures into pictures and patterns of sounds – let alone grasping how metal can be manufactured by smelting rocks, or the still more abstruse science of electromagnetism.

My point here is not to disparage the blackbird’s inferior intellect, since it very possibly understands things that we cannot; but to stress how we are unknowingly constrained in ways we very likely share with the bird. As Hamlet cheeks his friend: “There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy.”

Some of these things – and especially the non-things! – may slip us by forever as unknown unknowns purely by virtue of their inherently undetectable nature. Others may be right under our nose and yet, just like the oblivious bird perched on its metal branch who can never consider reasons for why it is there, we too may lack any capacity even to understand that there is any puzzle at all.

*

I opened the chapter with a familiar Darwinian account of human beings as apex predators struggling for survival on an ecological battlefield; perpetually fighting over scraps, and otherwise competing over a meagre share of strictly limited resources. It is a vision of reality founded upon our collective belief in scientific materialism, and although a rather depressing vision, it has become today’s prevailing orthodoxy – the Weltanschauung of our times – albeit seldom expressed so antiseptically as it might be.

Indeed, to boil this down further, as doctrinaire materialist hardliners really ought to insist, we might best comprehend ourselves as biological robots. Why robots? Because according to this shared doctrine humans are genetically coded not for the experience of life, nor even purely for survival, but for reproductive success. This is our function – we consume, compete and procreate – and we are evolved to function for just such time as allows us to fulfil this sole objective. Our death is indeed as inconsequential as it is inevitable.

Accordingly, propagation of every species goes on blindly until such time as the species as a whole inevitably becomes extinct. If this process is extended by technological means beyond even the death of the earth and solar system, then it will end when the entire universe succumbs to its own overarching and insignificant end. No amount of space colonisation can finally save us from such a fate.

More nakedly told, it is not merely that, as Nietzsche famously lamented, “God is dead” – which has some upsides – but that, while richly animated, there is nothing going on whatsoever besides machine process, anywhere in this universe or the next. In fact, this reduction of the cosmos to machine process is Hobbes’ vision in a nutshell too.

In common with the old religions, this new mechanistic belief system extends boundless and absolute, and thereby encompasses whatever remnants of any god or gods we might try to salvage. There exists no location for any god within, nor even the apparatus to exercise free will. Virtue, compassion and love are all epiphenomenal illusions. Remission comes only in the form of a compensatory genetic subroutine enabling us to carry on regardless of the painful irrelevance of our human situation.

Unsurprisingly, we seldom reflect on the deep existential ramifications of our given materialist mythos, which is, for the most part, unconsciously inculcated; and pretty much no-one lives a life in strict nihilistic accord. Instead, we mostly bump along trying to be good people (a religious hangover perhaps), with an outlook that approximates to the one most succinctly expressed by Morty Smith: “Nobody exists on purpose, nobody belongs anywhere, everybody’s gonna die. Come watch TV.” 39a

This is our modern story and we’re stuck with it. Unless, of course, we can dream up a better one…

*

III      Blinded by history

“All history is nothing but a continuous transformation of human nature”

— Karl Marx 40

*

History, someone once joked, is just one damn thing after another! A neat one-liner, since disassembling history by such vulgar reductio ad absurdum is amusing. And glanced at, whether by highlighting a few isolated and sporadic peculiarities or skipping across centuries in search of repetitions, the sequences of too-often terrible events may appear to follow with little to no apparent connection or purpose; the rise and fall of civilisations happening without rhyme or reason and with scarcely more intent than the random walk of a drunkard. Advancement may be admitted in both cases, of course, for in spite of deficiencies in one’s sense of direction the inebriated still generally make it back home!

Unfortunately, such disjointed views of history are actually rather hard to avoid. For one thing, there’s an awful lot of history out there and comparatively little time to learn about it. Nevertheless, any sort of ‘one damn thing after another’ approach, irrespective of the earth-shattering relevance of the facts in themselves, represents a kind of freeze-dried version; our human world shrivelled up to the most desiccated of husks, and completely devoid of the life that made it.

In fact, why bother studying it at all when it is so detached from reality and makes so little sense? To paraphrase Henry Ford, history thus reduced truly is bunk, although traditionally and especially at school, history has often been taught in this fashion: as one damned thing after another… all significant dates to be learned by rote.

By contrast, real historians are primarily interested in connecting the dots. Their goal is to reconstruct the past much as palaeontologists reconstruct dinosaurs, by attempting to put plausible flesh back on to the real bones from excavations. Difficulties of similar kinds have to be confronted and overcome by experts in both fields. When you are working entirely from bones, all of the muscle, skin, fur and patterns of behaviour must be added on the basis of what you know about living, or at least less extinct, creatures. If there is a close living relative then the task may be comparatively easy; less so when it’s a stegosaurus or T. rex. Likewise, it is obviously far easier to understand the motives and behaviour of people in societies anthropologically similar to our own. Once we venture into prehistory – something I am coming back to consider – this complication is massively compounded.

As a child, I learnt about an enormously long, herbivorous monster called the brontosaurus, although it transpires that no such creature ever walked the Earth… at least not quite such a creature. Its discoverer, Othniel Charles Marsh, in his rush to establish a new species, accidentally got some bones jumbled up. Worse than this, Marsh, having excavated a nearly complete skeleton – lacking only the skull – had creatively added a composite head constructed with finds from other locations. The brontosaurus he thought he’d discovered was in fact an adult specimen of an already classified group, the apatosaurus.

While palaeontologists depend on fossil records, of course, historians work from surviving remnants of a quite different kind: books, documents, diaries, and during more recent times, photographs and audio-visual recordings. For interpretations beyond living memory (which is rather short) the historian is obliged to rely on such documentary sources. The difficulty faced is thereby magnified, since, unlike bones and rocks, human records can and do frequently distort the truth (both accidentally and wilfully – and human memory is extremely flaky).

How, then, does a scrupulous historian know which records to trust when faced with contradictory evidence? How to ascribe greater reliability to some sources over others? Or determine whether a freshly unearthed primary source is reliable or unreliable; authentic or a hoax? Well, basically they turn detective and begin performing cross-checks, just as a good police detective or criminal lawyer will cross-examine witnesses to corroborate evidence and ascertain the truth. Although there remains an ineluctable circularity here – something palaeontologists do not encounter – since new records are commonly informed by or founded upon previous ones, and so updated accounts are generally prefigured by the older stories.

In 1983, when the Hitler Diaries turned up out of the blue, they were quickly authenticated by three expert historians: Hugh Trevor-Roper, Eberhard Jäckel and Gerhard Weinberg. Of course, the diaries were shortly afterwards exposed as forgeries, totally discredited by direct forensic analysis; the handwriting turned out to be the biggest immediate give-away. This embarrassing episode is mostly forgotten today, although it remains instructive. Hitler had been dead for less than four decades, well within living memory, and there were ample surviving handwritten documents to compare against. Such unassailable forensic evidence is the exception rather than the rule for the greatest tracts of history.

So historians have their work cut out, since if history is to be a living subject then, even beyond the reliable facts surrounding its central events, care must be taken to nurture the warm, moist uncertainty of the real lives that not just made it but lived it. On the one hand, history is a sketchbook, while on the other, as archaeologist and historian John Romer once elegantly put it: “History is only myth: stories trying to make sense of reality” 41

*

Two decades ago, I embarked on an adventure to the USA. Travelling with Neil, a friend and post-graduate colleague, to the International Conference on Asteroids, Comets and Meteors in Flagstaff, Arizona, we were wined and dined and given tours of the Grand Canyon and Meteor Crater. It was a most splendid jolly!

After the conference, we took a tour to explore a little further into the great continent. We hired a car and headed west on Route 66, only reaching our final destination, San Francisco, after a solid week of driving. Along the way, we stopped to admire the great Hoover Dam, Las Vegas, Death Valley, Los Angeles, the giant redwoods and the towering rocks of Monument Valley, which form such a spectacular backdrop to so many Westerns. En route we had also encountered the occasional roadside stalls where Native Americans selling trinkets would try to entice passing trade with off-road signs and promises of dinosaur footprints.

On one of these excursions in Arizona we had visited perhaps the most famous of all petrified forests (known straightforwardly as Petrified Forest National Park), with fossilised trees laid strewn like ancient bronze-casts, and nearby we also wandered the ruined remains of human settlements. The ruins had signs too, ones that told us the houses were built some six hundred years ago, or, as the notes put it: “prehistoric”. Well, that had made us laugh, although we shouldn’t have. The idea that a mere six hundred years old could be designated “prehistoric” was not another fine example of dumbass American thinking, but a straightforward fact that two ignorant Europeans misunderstood: history, as I said above, is a discipline that arises purely out of documentation. Automatically, therefore, we – meaning all modern people – have, to put matters mildly, an historical bias.

At the risk of sounding worthy (or, in more current parlance, ‘woke’), I’d like to draw attention to a few related misconceptions. First, Christopher Columbus did not discover America. Today most people are well aware of this indisputable fact, and academics once marginalised simply for reminding us of this and other more painful truths are fully vindicated.

For one thing, literally millions of people were already living in North America prior to that fateful date of fourteen hundred and ninety-two: a forgotten civilisation. Today, having lost their land to settlers, most descendants remain on reservations, where they may earn a few bucks from passing tourists, lured in with those promises of dinosaur footprints.

But more than this, Columbus wasn’t the first European to sail to the ‘New World’. Again, as many people know today, the real honour goes to Erik Thorvaldsson – better known as Erik the Red – the Viking explorer credited in the Icelandic sagas with founding the first settlement in Greenland. Nor was Columbus the first European ever to set foot on continental American soil. The plaudits here go instead to Thorvaldsson’s son, Leif Erikson, who according to the sagas established a Norse settlement in Vinland, now identified with Newfoundland. All of this took place an astonishing five centuries before the voyage of the Genoese pretender Columbus.

So, if not discovery, what did Columbus’ arrival really bring to this story? Well, the answer can be found and understood simply by reading between the lines of his captain’s log. Here, for instance, is what he writes about the ship’s first encounter with the Arawak Indians who inhabited the archipelago known today as the Bahamas:

They go as naked as when their mothers bore them, and so do the women, although I did not see more than one young girl. All I saw were youths, none more than thirty years of age. They are very well made, with very handsome bodies, and very good countenances… They neither carry nor know anything of arms, for I showed them swords, and they took them by the blade and cut themselves through ignorance… They should be good servants and intelligent, for I observed that they quickly took in what was said to them, and I believe they would easily be made Christians, as it appeared to me that they had no religion.

On the very next day, Columbus writes:

I was attentive, and took trouble to ascertain if there was gold. I saw that some of them had a small piece fastened in a hole they have in the nose, and by signs I was able to make out that to the south, or going from an island to the south, there was a king who had great cups full, and who possessed a great quantity.

The following day, a Sunday, Columbus decided to explore the other side of the island, and once again was welcomed by the villagers. He writes:

I saw a piece of land which appeared like an island, although it is not one, and on it there were six houses. It might be converted into an island in two days, though I do not see that it would be necessary, for these people are very simple as regards the use of arms, as your Highnesses will see from the seven that I caused to be taken, to bring home and learn our language and return; unless your Highnesses should order them all to be brought to Castile, or to be kept as captives on the same island; for with fifty men they can all be subjugated and made to do what is required of them. 42

Having failed in his original quest for gold, Columbus’ subsequent expeditions sought out a different cargo to bring back to Spain. In 1495, they corralled 1,500 Arawak men, women and children in pens and selected the fittest five hundred specimens for transportation. Two hundred died onboard the ships and the survivors were sold into slavery. Unfortunately for Columbus, however, and in turn for the native people of the Caribbean, this trade in humans was insufficiently profitable to pay back his investors, and so Columbus adopted a different strategy and intensified his search for gold again.

In Haiti, where he believed the precious metal lay in greatest abundance, Columbus soon demanded that everyone over the age of fourteen deliver a quarterly tribute of gold in exchange for a copper token. Failure to comply was severely punished by the amputation of limbs, the victims left to bleed to death, and those who tried out of desperation to escape were hunted down with dogs and then summarily executed.

Bartolome de las Casas, a young priest who had arrived to participate in the conquest and was indeed for a time a plantation owner, afterwards became an outspoken critic and reported on the many atrocities he witnessed. 43 In his own three-volume chronicle, History of the Indies, las Casas later wrote:

The Indians were totally deprived of their freedom and were put into the harshest, fiercest, most horrible servitude and captivity which no one who has not seen it can understand. Even beasts enjoy more freedom when they are allowed to graze in the field. 44

*

Napoleon is credited with the utterance that “History is written by the winners”, or alternatively, “What is history but a fable agreed upon?” 45, and for one with such a prodigious record both of winning and of “making history”, who doubts that he knew whereof he spoke.

Strange, therefore, how little attention is generally paid to Napoleon’s straight-talking, no-nonsense maxim; how instead we eagerly absorb the authorised versions of our histories, trusting that by virtue of scholarly diligence and impartiality, these reconstructions of the past represent a close facsimile of the actual events.

Of course, when it comes to the centuries of fractious infighting between the European monarchies, we are at least privy to the accounts of both adversaries. So in general we have – at minimum – two sides to the story of every conflict, plus competing and alternative versions of reports of criminal acts and of many other scandals. In stark contrast, however, when the British and the other European powers sailed off to unconquered lands soon afterwards to be known collectively as “the colonies”, only one side of the story remains extant.

For during the period of the last five hundred years or so, the era when western records have been most replete, a world once teeming with a diversity of alternative cultures was slowly wiped away: the inhabitants of these forgotten worlds either annihilated or wholly assimilated by the great European powers. Thus an increasingly homogeneous culture, sometimes by the terror of cannons and on other occasions by the softer coercions of the sermons of missionaries, steadily erased and replaced the heterogeneous confusion very nearly as swiftly as it was encountered. Defeated cultures, if not entire indigenous populations, were not just swept aside, but utterly and irreversibly deleted.

Oral traditions leave little if anything by way of an historical trace, and so back in the fifteenth century, America was indeed “prehistoric”; its history having been established only after the alien invaders first stepped ashore (and Europeans must surely have appeared to the wide eyes of the native peoples they were about to overwhelm, literally as creatures from another world). And as in the Americas, so too in Australia and the other ‘new worlds’, where, of the novelties we brought along, arguably the most significant was History itself.

Bear in mind, therefore, that throughout most regions of the world and most of human time, people didn’t have history at all, because history per se begins with writing; another largely Eurasian preoccupation. Thus history in most parts of the world starts with our arrival: its origins, an indirect consequence of conquest, oppression, exploitation and enslavement.

Pulitzer-prize winning journalist, author and activist Chris Hedges discusses the teaching of history as a form of indoctrination with Professor James W. Loewen, author of ‘Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong’:

*

At this juncture, it is tempting to set out a comprehensive list of all the barbarisms of history (one damned thing after another), although to do justice I would certainly need to double the length of the current chapter. Instead, just a few examples more than serve my purpose of illustrating the point…

Invasions from the north took the dreadful shape of Viking longboats, their crews remembered today for rape and pillage; from the east came the marauding Huns and then the Mongol horde, later followed by the butchery of tyrants such as Vlad the Impaler; in the Mediterranean south, entertainment was once provided by the sadistic spectaculars of the Roman circuses, and afterwards came the more ideologically entrenched atrocities of the Spanish Inquisition. And when the first Europeans explored the lands of the west, the ruthless conquistadors came face to face with the blood-curdling atrocities of the Aztec and Mayan empires, in which human sacrificial victims were regularly slaughtered in their hundreds and thousands. Which was the more fearsome and savage?

In former times, Christians marched across whole continents to slay innocents in the name of the Prince of Peace, and, in astonishingly recent times, other Christians dispatched heathens and heretics by drowning, burning and lynching, especially at the height of the witch craze that swept Europe and America well into the Enlightenment period.

Muslims, by comparison, have generally preferred to kill their enemies in the name of Jihad and Fatwa, or else to inflict judicial cruelties by means of stoning, flagellation, amputation and decapitation, all in strict accordance with their holy Sharia Law. But then the irreligious are no less diabolical, whether we consider Hitler and the Nazi death camps, the Soviet gulags, the killing fields of Cambodia, or Mao Tse-tung’s “Cultural Revolution” in China. Given how little time has passed since the decline of religion, the sheer number of victims tortured and murdered by these surrogate atheistic (or perhaps neo-pagan in the case of the Nazis) regimes is as gut-wrenching as it is perplexing.

Few have spoken with more forceful eloquence or erudition on the evils of religion than ardent atheist Christopher Hitchens. Sadly, it was this same hatred of religion that in the end led Hitchens to join the chorus calling for the neo-imperialist ‘war on terror’, and finally to argue the case – in a 2003 collection of essays entitled A Long Short War: The Postponed Liberation of Iraq – for the ‘shock and awe’ bombing and subsequent invasion of Iraq, at the cost of more than a million innocent lives. One of Hitchens’ prime examples of religious authority making good people behave in morally repugnant ways is the barbarous practice of infant genital mutilation:

Britain itself witnessed centuries of religious intolerance, brutal repression and outright thuggery. Henry VIII, one of the most celebrated monsters in history, is chiefly remembered for his penchant for uxoricide, not to mention the land-grabbing and bloodletting of the English Reformation that followed from the convenience of his divorce from Catherine of Aragon. And like father, like daughter: a radical transformation of the sectarian landscape under Henry was partially undone by Bloody Mary’s reign of terror and her ultimately failed restoration of Catholicism (had she been more successful, doubtless her epithet would not now be “Bloody”).

Meanwhile, the sudden rise and spread of the British and other European empires meant such commonplace domestic atrocities could, during the next four hundred years, be committed as far afield as Africa, North and South America, India, China, and Australia. All of this was facilitated by, and in turn facilitated and encouraged, the international trade in human slaves. Of course, the European place in world history has been a repeatedly shameful one, but then man’s inhumanity to man can be legitimised and justified for a hundred other reasons beneath dozens of alternative flags. According to the historical record, then, human nature is infernally bad, and incurably so.

Cruel, bellicose, sneaky and selfish: according to the historical record we ought to plead guilty on all counts. But then the historical record is a limited one, as outlined above – the meek have been disinherited from the world, almost systematically so.

*

The French writer Voltaire is nowadays best remembered for his marvellous satire Candide (1759), which he subtitled with characteristic irony: “or the Optimist”. A savage critique of the unenlightened politics and obscurantist metaphysics of his time, Candide is an historical fantasy, with many episodes in the book cleverly interwoven with factual events of the period. It is rightly celebrated, and I reference its central theme in the addendum below. A decade earlier, however, Voltaire had road-tested similar ideas, choosing not an historical backdrop, but one that we might today describe as science fiction. A forgotten classic, Voltaire’s Micromegas (1750) is a story about the adventures of two philosophical aliens. Here is a brief synopsis.

Micromegas, the eponymous hero, is a gigantic inhabitant of the star Sirius, who ventures to Earth, stopping off at Saturn along the way. Micromegas being many miles tall, the Saturnians, though themselves as tall as small hills, nevertheless appear to him as pigmies, and so his initial response is to deride them: “accustomed as he was at the sight of novelties, he could not for his life repress that supercilious and conceited smile which often escapes the wisest philosopher, when he [first] perceived the smallness of that globe, and the diminutive size of the inhabitants”. Eventually, however, and once the Saturnians ceased to be amazed by his gigantic presence, he befriends the secretary of the Academy of Saturn. Having discussed the comparative differences between their two worlds, Micromegas and the Saturnian resolve to set off on a grand tour of the Solar System. Shortly afterwards they arrive on Earth.

Upon landing, they decide to search around for evidence of intelligence, but discover no signs of life at all except, eventually, for a whale, which the Saturnian catches between his fingers and shows to Micromegas, “who laughed heartily at the excessive smallness peculiar to the inhabitants of our globe”. As luck would have it, however, a ship of philosophers happens to be returning from a polar expedition, and aboard this ship the aliens soon encounter “a creature very different from the whale”.

Having established contact with the “intelligent atoms” aboard the ship, the alien philosophers are curious to learn about a life so “unencumbered with matter, and, to all appearance, little else than soul”, conjecturing that such tiny earthlings must spend their lives “in the delights of love and reflection, which are the true enjoyments of the perfect spirit”. Of course, they are very quickly disabused of such idealist illusions by those on-board:

“We have matter enough,” said [one of the philosophers], “to do abundance of mischief, if mischief comes of matter; and too much understanding, if evil flows from understanding. You must know, for example, that at this very moment, while I am speaking, there are one hundred thousand animals of our own species, covered in hats, slaying an equal number of fellow-creatures who wear turbans; or else are slain by them; and this hath been nearly the case all over the earth from time immemorial…”

“The dispute is about a mud-heap, no bigger than your heel,” continued the philosopher. “It is not that any one of those millions who cut one another’s throats pretends to have the least claim to that clod; the question is to know, whether it shall belong to a certain person who is known by the name of Sultan, or to another whom (for what reason I know not) they dignify with the appellation Caesar. Neither the one nor the other has ever seen, or ever will see, the pitiful corner in question; and scarcely one of those wretches who slay one another hath ever beheld the animal on whose account they are mutually slain!”

Sadly, little has changed since Voltaire wrote his story more than two hundred and fifty years ago. 46

*

But now a related question: why did Europe become such a dominant force in the first place? This, arguably, is the greatest, most important question in all of our History, though one that until contemporary times was met with the most hubristic of lame answers:

The white race is the most versatile, has the most initiative, a greater facility for organization, and a more practical outlook in life. This has led to its mastery of the material side of living, urged it to invention and discovery, and to the development of industry, commerce and science.

So begins an explication outlined under the horrifically racist heading “why is the white race dominant?”, as quoted from a pre-war children’s ‘book of facts’ entitled How Much do You Know?, a copy of which I happen to own. The author’s deep-seated yet unconscious white supremacist mindset presumes such an excruciating air of colonial haughtiness that immediately afterwards the book summarises the other “races” as follows:

The black race, enervated by the heat of the tropics, has never shown great capacity for sustained or combined effort. The brown race, also found in hot climates, has produced the world’s main religions, and is excelled in artistic handicrafts. The yellow race is said still to have a slave mentality: the individual matters nothing, the community all. 47

When I showed this passage to my father he was rightly outraged. Those opinions were outdated and unacceptable when I was at school, he told me. But then my father went to school a full decade after the book’s publication. A world war had since been and gone. Perceptions and attitudes had evidently changed – greatly for the better.

And yet, if we hold our nose to the overwhelming stench of casual racism, there is, within the same passage, one idea that might – if expressed more sensitively – resonate with a somewhat permissible and rather commonly held opinion that still abounds today:

It [the white race – Europeans] has had the advantage also of living for the most part in temperate climates, where the struggle for existence has been neither too difficult nor too easy.

In a sense, it was this very assumption that Jared Diamond attempted not so much to dispel as to correct in his best-selling book, Guns, Germs and Steel. In pursuit of that end, he dedicated thirty years of his life on the road, trying to understand precisely why Europe came to dominate the world, and he makes the intriguing and largely convincing case that the roots of present global inequality were basically an outcome of freak circumstances and happenstance. Not simply “the advantage also of living for the most part in temperate climates” – although, according to Diamond at least, climate has had a vital part to play in the ascent of the West – but also other advantages conferred by location and historical timing.

His book begins by reminding us how the very origins of human civilisation in the Fertile Crescent of the Middle East depended upon the accidental occurrence of arable crops and of animals suitable for domestication. These two factors opened the way to a land of plenty. Given that the rise of agriculture was inevitable, Diamond says, and since its origins happened to occupy a central geographical location in the Eurasian landmass – a super-continent fortuitously oriented east to west, providing similar lengths of day, seasons and climates along its axis – it was comparatively easy for these new modes of agriculture to propagate as the people slowly migrated. A led to B led to C, if only because the rise of A, B and C was so perfectly compatible.

Thanks to the development of agriculture, the population enjoyed a surplus, and this in turn brought about the rise of trade and, no less importantly, of free time. So the people in the new settlements would spend extended periods preoccupied with otherwise unproductive activities, such as making stylistic improvements to their houses and other amenities, rather than, as in former times, gathering nuts or trapping pigs. This new freedom resulted in the rise of new technologies which, with time to spare, could also then be refined – undoubtedly the most significant of which was the production of metals and the development of metal-working skills: ploughshares that were later beaten into swords.

Trade routes led to the transmission of new ideas, and once the discovery of gunpowder in China reached the shores of the Middle East, its military use was quickly perfected. And it was thanks to the early invention of writing – which arose on very few occasions worldwide, and just once outside the super-continent of Eurasia, with the development of Mayan script in Mexico – that this steady transmission of ideas and innovations thereafter accelerated.

As a consequence, the Eurasian civilisations had everything in place to begin their takeover, plus a secret weapon in reserve which they weren’t even aware of – germs. Our 10,000 years of domestication of so many species had inadvertently equipped these Eurasian invaders with an arsenal of new biological agents: diseases to which they themselves had considerable immunity – smallpox from cattle, chicken-pox and influenza from poultry, to name but three examples. In North and South America, by contrast, many people did not live in such close proximity to domesticated animals, and so had neither immunity nor exotic infections of their own to spread. Conquests by war were thus very often followed by pandemics more devastating than even our swords and cannons – although more recently, once the genocidal effect of disease had been better understood, the contamination of Native Americans became chillingly deliberate. The rest is history… our history.

Following behind the vanguard of conquerors and explorers, a variety of enterprising European settlers made land grabs for King and Country, and as the empires grew, so a few European superpowers came to dominance. According to Diamond’s version, then, it was by virtue of the happenstance of circumstance, the stars very firmly in our favour, that these new kingdoms of the West were first won and then overrun.

The rise of agriculture was a fluke, and the inventions of the printing press and the gun lucky but likely consequences: Diamond presents us with a timeline of evidence to show how European dominance had nothing to do with superior intelligence, or even, that less racist presupposition, superior ideology. We would have won with or without the Protestant work-ethic, and with or without the self-righteous and assertive arrogance that often comes with worship of a One True God; a god who permits unlimited belligerence for holy ends.

In reaching this conclusion, however, Diamond is surely being too much the professor of geography, the scientist and the archaeologist, and not sufficiently the historian, because even his own evidence doesn’t entirely support such an overarching claim. For when it came to Europe’s seizure of Africa, the tables were to some extent turned: the European settlers were now highly susceptible to the ravages of tropical disease, and our advantages, including of course the superiority of our weaponry, were more than ever buttressed by an unshakeable ideology – that pseudo-religio-scientific notion of racial superiority so imprinted on the minds of the colonisers. It is the European mindset that finally retilts the balance. For the natives needed “civilising”, and despite the ever-present dangers of famine and disease, more than enough Europeans were driven by the profit motive and a deep-seated belief in the virtue of “carrying the white man’s burden”.

*

Bruce Parry is an indigenous rights advocate, author, explorer and filmmaker. He has lived with some of the most isolated tribes in the world, learning from how they interact with each other and the planet. After much exploration, one of the things that has truly inspired Bruce is the idea of egalitarian living. In August 2019, Ross Ashcroft, host of RT’s ‘Renegade Inc.’, caught up with him to hear his ideas on how we can rethink our leadership structures and muster the courage to look within, so that we are able to change the modern western narrative:

*

All of the stories we tell fall within two broad categories. First there are our quotidian tales of the everyday. What happened when and to whom. Loosely we might say that all of these are our ‘histories’ whether biographical, personal, anecdotal, or traditional histories that define nations, and where it may be noted the words ‘story’ and ‘history’ are synonymous in many languages. 48 But there are also stories of a second, more fundamental kind: those of fairytale, myth and allegory that sometimes arise as if spontaneously, and though deviating from the strict if mundane ‘truth of accountants’, are able to penetrate and bring to light otherwise occluded insights and wisdom.

Stories of the second kind have sprung forth in all cultures, often sharing common themes and characters. These include stories of creation; of apocalypse; of the wantonness of gods; of murder and revenge; of cosmic love and of battles between superheroes. Interestingly, the songlines of the Australian Aboriginals map their own stories of origin directly to the land. Less fantastical and wondrous, the civilised world too has its nationalistic versions of what might be more loosely considered ‘songlines’. In England, for instance, we might trace the nation’s genealogy via Stonehenge, Runnymede, Sherwood Forest, Hastings, Agincourt, the white cliffs of Dover and Avalon (today called Glastonbury). Accordingly, Stonehenge tells us we are an ancient people; Runnymede that we are not slaves; Sherwood Forest that we are rebellious and cheer for the underdog; Hastings, Agincourt and the white cliffs of Dover that we are a warrior nation seldom defeated, in part because our isle is all but impregnable; while Avalon, to steal from Shakespeare, makes ours a “blessed plot”:

This royal throne of kings, this sceptred isle,
This earth of Majesty, this seat of Mars,
This other Eden, demi-paradise;
This fortress built by Nature for herself,
Against infection and the hand of war,
This happy breed of men, this little world,
This precious stone set in the silver sea,
Which serves it in the office of a wall,
Or as a moat defensive to a house,
Against the envy of less happier lands;
This blessed plot, this earth, this realm, this England… 49

So here we find history and myth entwined as unavoidably as if they were stories of a single kind. But then what is the past when it is not fully-fleshed and retold in stories? Unlike the rest of the extinct world, it cannot be preserved in jars of formaldehyde and afterwards pinned out on a dissecting table. To paraphrase George Orwell, the stories of our past are not just informed by the present, they are in part reconstituted from it, and thereafter those same stories ineluctably propel us into the future. Not that there is some future already fixed and inescapable, since we have no reason to presume it is, but that what unfolds is already prefigured in our stories, which then guide it like strange attractors, just as today’s world was prefigured by stories told yesterday. If things were otherwise, history would indeed be bunk – nothing more or less than a quaint curiosity. Instead it is an active creator, and all the more dangerous for that. 50

In 1971, Monty Python appeared in an hour-long May Day special showcasing the best of European TV variety. Python’s contribution was a six-minute piece describing traditional May Day celebrations in England, including the magnificent Lowestoft fish-slapping dance [at 2:30 mins]. It also featured as part of BBC2’s “Python Night” broadcast in 1999:

*

IV      Mostly Harmless

“Human nature is not of itself vicious”

— Thomas Paine 51

*

In the eyes of many today, since our evil acts far exceed our good deeds – indisputably so, given the innumerable massacres, pogroms, genocides and other atrocities that make up so much of our collective history – the verdict on ‘human nature’ is clear and unequivocal. With the evidence piled so precipitously against us as a species, we ought to plead guilty in the hope of leniency. However, even though at first glance the case does indeed appear an open-and-shut one, this is not a full account of human nature. There is also the better half to being human, although our virtues are undoubtedly harder to appraise than our faults.

Firstly, we must deal with what might be called ‘the calculus of goodness’. I’ve already hinted at this but let me now be more explicit: whenever a person is kind and considerate, the problem with ‘the calculus’ is how those acts of kindness are to be counted against prior acts of indifference or malevolence. Or to broaden this: how is any number of saints to make up for the actions of so many devils? Can the accumulation of lesser acts of everyday kindness, in aggregation, ever fully compensate for a single instance of rape, torture or cold-blooded murder? Or, to raise the same issue on the larger stage again, how do the smallpox and polio vaccines, which undoubtedly saved a great deal of suffering and the lives of millions, compensate for the bombings of Guernica, Coventry, Dresden, Hiroshima and Nagasaki? For aside from the moral dubiousness of all such utilitarian calculations, the reality is that inflicting harm and causing misery is on the whole so much easier than manufacturing any equivalence of good.

And this imbalance is partly an unfortunate fact of life; a fact that new technologies can and will only exacerbate. So here is a terrible problem that the universe has foisted upon us. For destruction is, as a rule, always a much more likely outcome than creation; it happens all of the time, as things erode, decay, go wonky and simply give up the ghost. If you drop a vase onto a hard floor, then your vase will reliably shatter into a hundred shards, and yet, if you toss those same hundred shards back into the air they will never reform into a vase again. Or, as Creationists like to point out (entirely missing the bigger point that evolution is not a purely random process), no hurricane could ever blow the parts of a scrapyard together to form a Jumbo Jet. Destruction then – i.e., the turning of order into chaos – turns out to be the way our universe prefers to unwind. And it’s tough to fight against this.
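The point can be given a more formal gloss – nothing in my argument hangs on it, but Boltzmann’s entropy formula, which I borrow here purely as illustration, shows why the asymmetry is built into things:

$$S = k_B \ln W$$

Here $W$ counts the microscopic arrangements consistent with a given state, and the entropy $S$ grows with that count. Near enough one arrangement of atoms qualifies as ‘this vase’, while astronomical numbers qualify as ‘a pile of shards’; left to chance, matter drifts towards whichever state commands the most arrangements, which is why the shards never leap back into a vase.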

The random forces of extreme weather, earthquakes and fires are inherently destructive, precisely because they are erratic and haphazard. So if destruction is our wish, the universe bends rather easily to our will; and this is the diabolical asymmetry underlying the human condition.

In short, it will always be far easier to kill a man than to raise a child to become a man. Killing requires nothing more than the sudden slash of a blade, or the momentary pull on a trigger; the sheer randomness of the bullet’s tumbling wound being more than enough to destroy life. As technology advances, the push of a button multiplies that same potentiality and enables us to flatten entire cities, nations, civilisations. Today we enjoy the means for mega-destruction, and what was unimaginable in Voltaire’s day becomes another option forever “on the table”, in part, as I say, because destruction is an easy option, comparatively speaking – comparative to creation, that is.

Nevertheless, our modern weapons of mass destruction have all been wilfully conceived, and at great expense in terms both of time and resources, when we might instead have chosen to put such time and resources to wholly profitable use: protecting ourselves from the hazards of nature, thoroughly ridding the world of hunger and disease, or more generally helping to redress the natural though diabolical asymmetry of life. 52

Here then is a partial explanation for the malevolent excesses of human behaviour, although I concede, an ultimately unsatisfactory one. For however easily we are enabled to harm others – soft-bodied as we all are, living in a world beset by sharp objects and less visible perils – we nevertheless have the freedom to choose not to do so. To live and let live and to commit ourselves to the Golden Rule that we “do unto others as we would have others do unto us”. So my principal objection to any wholesale condemnation of our species will have little to do with the estranging and intractable universal laws of nature, however harshly those laws may punish our human condition; instead, it entails a defence founded on anthropocentric considerations.

For if human nature is indeed so fundamentally rotten, then what ought we to make of our indisputable virtues? Of friendship and love; to select a pair of shining examples. And what of the great social reformers and the peacemakers like Gandhi and Martin Luther King? What too of our most beautiful constructions in poetry, art and music? Just what are we to make of this better half to our human nature? And why did human beings formulate the Golden Rule in the first instance?

Of course, even apparent acts of generosity and kindness can have, and frequently do have, unspoken selfish motivations, so the most cynical adherents of the ‘dark soul hypothesis’ go further again, reaching the conclusion that all human action is either directly or indirectly self-serving. That friendship, love, poetry and music, along with every act of philanthropy (which literally means “love of man”), are all in one way or another products of the same innate selfishness. According to this surprisingly widespread opinion, even at our finest and most gallant the underlying motivation is always reducible to “you scratch my back…”

Needless to say, all of human behaviour really can, if we choose, be costed in such one-dimensional utilitarian terms. Every action evaluated on the basis of outcomes and measured in terms of personal gain, whether actual or perceived. Indeed, given the mountains of irrefutable evidence that people are all-too-often greedy, shallow, petty-minded and cruel, it is not irrational to believe that humans are invariably and unalterably out for themselves. It follows that kindness is only ever selfishness dressed up in mischievous disguise, and challenging such cynicism is far from easy and can feel like shouting over a gale. The abrupt answer here is that not all personal gain ought to be judged equivalently. For even if our every whim were, in some ultimate sense, inseparable from, contingent upon, and determined by self-interest, then who is this “self” in which our interests are so heavily vested?

Does the interest of the self include the wants and needs of our family and friends, or even, in special circumstances, the needs of complete strangers, and if so, then do we still call it ‘selfish’? If we love only because it means we receive love in return, or for the love of God (whatever this means), or simply for the pleasure of loving, and if in every case this is deemed selfish, then by definition all acts have become selfish. The meaning of selfishness is thus reduced to nothing more than “done for the self”, which misses the point entirely that selfishness implies a deficiency in the consideration of others. Thus, if we claim that all human action is born of selfishness, as some do, we basically redefine and reduce the meaning of ‘selfish’.

Having said this, I certainly do not wish, however tempting it may be, to paint a false smile where the mouth is secretly snarling. There is nothing to be usefully gained by naivety or sentimentality when it comes to gauging estimates of human nature. Nonetheless, there is an important reason to make a case in defence of our species, even if our defence must be limited to a few special cases. For if there is nothing at all defensible about ‘human nature’ it is hard to see past a paradox, which goes as follows: if human beings are innately and thus irredeemably bad (in accordance with our own estimation obviously), then how can our societies, with structures that are unavoidably and unalterably human, be anywise superior to the ‘human nature’ that designs them, rather than inherently and unalterably bad also? After all, ex nihilo nihil fit – nothing comes from nothing. This is, if you like, the Hobbesian Paradox. (And I shall return to it shortly.)

*

There have been many occasions when writing this book has felt to me a little like feeling around in the dark. Just what is it that I am so urgently trying to say? That feeling has never been more pronounced than when working on this chapter and the one ensuing. For human nature is a subject that leads into ever more divergent avenues and into deeper and finer complexities. What does it even mean to delve into questions about ‘human nature’? Already this presumes some general innate propensity that exists and provides a common explanation for all human behaviour. But immediately, this apparently simple issue brings forth a shifting maze of complications.

Firstly, there is the vital but unresolved debate over free will as opposed to determinism, which at one level is the oldest and most impenetrable of all philosophical problems. All attempts to address this must already presuppose sound concepts of the nature of Nature and of being. However, once we step down to the next level, as we must, we find no certain answers are provided by our physical sciences, which basically posit determinism from the outset in order to proceed.

Then there is a related issue of whether as biological organisms, humans are predominantly shaped by ‘nature or nurture’. In fact, it has become increasingly clear that the question itself is subtly altering, since it becomes evident that the dichotomy is a false one. What can be said with certainty is that inherited traits are encouraged, amplified, altered and sometimes prohibited by virtue of our environment due to processes occurring both at biological and social levels. Beyond this, nature and nurture cannot be so easily disentangled.

The tree grows and develops in accordance not merely with biochemical instructions encoded within its seed but in response to the place where that seed germinates: whether under full sunlight or deep shade, whether its roots penetrate rich or impoverished soil, and in accordance with temporal variations in wind and rainfall. We too are shaped not only by the flukes of genealogy, but by adapting moment by moment to environmental changes from the very instant our father’s sperm penetrated and merged with our mother’s egg. We are no more reducible to Dawkins’ ‘lumbering robots’, those vehicles “blindly programmed to preserve the selfish molecules known as genes” 53 that bloodlessly echo Hobbes, than we are to the ‘tabula rasa’ of Aristotle, Locke, Rousseau and Sartre. Yet somehow this argument lurches on, at least in the public consciousness, always demanding some kind of binary answer as though this remains a possibility.

As for the question of free will or determinism at a cosmic level, my personal belief is the one already presented in the book’s introduction, although to make matters absolutely unequivocal allow me to proffer my equivalent to Pascal’s famous wager: that one ought to live without hesitation as though free will exists, because if you are right, you gain everything, whereas if you are wrong, you lose nothing. Moreover, the view that we are without agency and altogether incapable of shaping our future involves a shallow pretence that also seeks to deny personal responsibility; it robs us of our dignity and self-respect, and disowns the god that dwells within.
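Set out baldly – and this tabulation is my own schematic gloss, not anything Pascal wrote – the wager reads:

$$\text{live as though free} \;\Rightarrow\; \begin{cases} \text{everything gained,} & \text{if free will exists;} \\ \text{nothing lost,} & \text{if it does not, since no other choice was ever open to us.} \end{cases}$$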

As for proof of this faculty, I have none, and the best supporting evidence is that on occasions when I have most compellingly perceived myself as a thoroughly free agent in the world, there has spontaneously arisen a corresponding anxiety: the sense that possession of such an extravagant gift entails acknowledging the sheer enormity of one’s responsibility. An overwhelming feeling that freedom comes with an excessively heavy price attached.

Indeed, my preferred interpretation of the myth of Eve’s temptation in the Garden of Eden follows from this: that the eating of “the apple” – i.e., the fruit of the tree of the knowledge of good and evil – miraculously and instantly gave birth to free will and conscience as one, with each sustaining the other (like the other snake, Ouroboros, perpetually eating its own tail). It follows that The Fall is nothing besides our human awakening to the contradistinction of good and evil actions, and thus interpreted, this apprehension of morality is simply the contingent upshot of becoming free in a fully conscious sense. 54

Indeed, we might justifiably wonder upon what grounds the most dismal critiques of human nature are founded, if not for the prior existence of a full awareness of moral failings that is itself another component aspect and expression of that same nature. Or, as the French writer La Rochefoucauld put it in one of his most famous and eloquent maxims: “Hypocrisy is the homage which vice renders to virtue.” 55 That is, whenever the hypocrite says one thing then does another, he does it because he recognises his own iniquity but then feigns a moral conscience to hide his shame. Less succinctly, it might be restated that acting with good conscience is hard-wired, and for most people (sociopaths presumably excluded) doing otherwise automatically involves us in compensatory acts of dissemblance, denial and self-delusion.

We have no reason to say humans are wholly exceptional in possessing a conscience, of course, although it seems that we are uncommonly sensitive when it comes to detecting injustice, perhaps because (admittedly, this is a hunch) we are uniquely gifted empathisers. Unfortunately, such prodigious talent for getting into the minds of others is one that also makes our species uniquely dangerous.

James Hillman was an American psychologist, who studied at, and then guided studies for, the C.G. Jung Institute in Zurich. In the following interview he speaks about how we have lost our connection to the cosmos and consequently our feelings for the beauty in the world and with it our love for life:

*

The Enlightenment struck many blows, one of which effectively killed God (or at least certain kinds of Theism). In the process, it more inadvertently toppled the pedestal upon which humanity had earlier placed itself, as Darwinism slowly but inevitably brought us all back down to earth with a bump. No longer the lords of creation, we nevertheless find the shibboleth of anthropocentrism much harder to shake.

Hobbes convinced us that ‘human nature’ is dangerous because it is Nature. Rousseau then took the opposing view, arguing that our real problems actually stem from not behaving naturally enough. His famous declaration that “Man is born free, and everywhere he is in chains” forms the opening sentence of his seminal work The Social Contract; the spark that helped to ignite revolutions across Europe. 56 Less than a century later, Marx and Engels concluded The Communist Manifesto by echoing Rousseau with the no less famous imperative, often paraphrased as: “Workers of the world unite! You have nothing to lose but your chains!” 57

In the place of freedom and perhaps out of a desperate sense of loss, we soon recreated ourselves as gods instead and then set about constructing new pedestals based on fascist and Soviet designs. But finally, the truth was out. Humans make terrible gods. And as we tore down the past, remembering in horror the death camps and the gulags, we also invented new stories about ourselves.

In the process, the post-Hobbesian myth of ‘human nature’ took another stride. Rather than being on a level with the rest of creation and mechanically compelled to lust for power and material sustenance like all animals, our species was recast once again as sui generis, though in a different way. Beyond the ability to wield tools, and to manipulate the world through language and indeed culture more generally, we came to the conclusion that the one truly exceptional feature of humans – the really big thing that differentiates ‘human nature’ from the whole of the rest of nature – was our species’ outstanding tendency to be rapacious and cruel. Thanks to our peculiar desire for self-aggrandisement, this has become the latest way we flatter ourselves.

It is sometimes said, for instance, that humans are the only creatures that take amusement in cruelty. Indeed, at first glance this sounds like a perfectly fair accusation, but just a little consideration finds it to be false. Take the example of the well-fed cat that is stalking a bird: does it not find amusement of a feline kind in its hunt? When it toys with a cornered mouse, meting out a slow death from the multiple blows of its retractable claws, is it not enjoying itself? And what other reason can explain why killer whales will often toss a baby seal from mouth to mouth – shouldn’t they just put it out of its misery?

Ah yes, comes the rejoinder, but still we are the only creatures to engage in full-scale warfare. Well, again, yes and no. The social insects go to war too. Chemical weapons are deployed as one colony defends itself from the raids of an aggressor. When this is granted, here’s the next comeback: ah, but we bring malice aforethought. The social insects are merely acting in response to chemical stimuli. They have pheromones for war, but no savage intent.

This brings us a little closer to home – too close perhaps – since it is well documented that chimpanzees gang up to fight against a rival neighbouring troop. How is this to be differentiated from our own outbreaks of tribal and sectarian violence?

That chimpanzees are capable of malice aforethought has long been known too. Indeed, they have been observed on occasion to bring a weapon to the scene of an attack. But then, you might expect our immediate evolutionary cousins to share a few of our vices! However, in the 1970s, primatologist Jane Goodall was still more dismayed when she saw how the wild chimps she was studying descended into a kind of civil war: systematically killing a group of ‘separatists’ one by one and apparently planning their campaign in advance. 57a So yes, without any doubt, humans are best able of all creatures to act with malice aforethought, yet even in this we are apparently not alone.

Okay then… and here is the current fashion in humanity’s self-abasement… we are the only creatures that deliberately destroy their own environment. But again, what does this really mean? When rabbits first landed in Australia (admittedly introduced by humans), did they settle down for a fair share of what was available? When domestic cats first appeared in New Zealand (and sorry to pick on cats again), did they negotiate terms with the flightless birds? And what of the crown-of-thorns starfish that devours the coral reefs, or of the voracious Humboldt squid swarming in some parts of our oceans and consuming every living thing in sight? Or consider this: when the continents of North and South America first collided and a land bridge allowed the Old World creatures of the North to encounter the New World creatures of the South, the migration of the former caused mass extinction of the latter. The Old World creatures, being better adapted to the new circumstances, simply ate the competition. There was not a man in sight.

In short, Nature’s balance is not maintained thanks to generosity and co-operation between species: this is a human conceit. Her ways are all too often cruel. Foxes eat rabbits, and in consequence their populations grow and shrink reciprocally. Where there is an abundance of prey the predators thrive, but once numbers reach a critical point that feast becomes a famine, which restores the original balance. This is how ‘Nature’s balance’ is usually maintained – just as Malthus correctly describes (more below). But modern humans have escaped this desperate battle for survival, and by means of clever artificial methods, enable our own populations to avoid both predation and famine; an unprecedented situation that really does finally set us apart from all of our fellow species.
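This feast-and-famine cycle was later given mathematical form in the textbook Lotka–Volterra equations – a formalisation from the 1920s, long after Malthus, which I sketch here only to show how little machinery the cycle requires. With $R$ for the rabbits and $F$ for the foxes:

$$\frac{dR}{dt} = \alpha R - \beta R F, \qquad \frac{dF}{dt} = \delta R F - \gamma F$$

Rabbits multiply at rate $\alpha$ and are eaten in proportion to chance encounters $RF$; foxes prosper on those same encounters and starve at rate $\gamma$ without them. The solutions are endless cycles: prey boom, predators boom, prey crash, predators starve, and round again.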

*

When Donald, son of the psychologists Winthrop and Luella Kellogg, turned ten months old, his parents took the extraordinary decision of adopting Gua, a seven-and-a-half-month-old female chimp, to bring up in their home as a surrogate sibling. It was the 1930s and this would be a pioneering experiment in primate behaviour; a comparative study that caused a good deal of dismay in academia and amongst the public. But irrespective of questions of ethics and oblivious to charges of sensationalism, the Kelloggs proceeded, and Donald and Gua ultimately lived together for nine months.

They soon developed a close bond. Although younger, Gua was actually more mature than Donald both intellectually and emotionally. Being protective, she would often hug him to cheer him up. Her development was remarkably swift, and she quickly learned how to eat with a spoon and to drink from a glass. She also learned to walk and to skip – obviously not natural behaviours for a chimp – as well as to comprehend basic words; all of this before Donald had caught up.

This comparative developmental study had to be cut short, however, because by the age of two, Donald’s behaviour was becoming disconcertingly apelike. For one thing, he was regressing to crawling. He had also learned to carry things in his mouth, picking up crumbs with his lips and one day chewing up a shoe, and far more than ordinary toddlers, he took delight in climbing the furniture and trees. Worse still, his language skills were seriously delayed, and by eighteen months he knew just three words, so that instead of talking he would frequently grunt or make chimp-like gesticulations instead. The story ends tragically, of course, as all of the concerns over ethics were confirmed. Gua died of pneumonia less than a year after the study was curtailed and she had been abandoned by the Kellogg family. Donald committed suicide later in life, at the age of 43.

This is a sad story and by retelling it I am in no way endorsing the treatment of Donald and Gua. No such experiment should ever have been conducted, but it was, and the results are absolutely startling nonetheless. Instead of “humanizing the ape”, as the Kelloggs hoped to achieve, the reverse had been occurring. What they had proved inadvertently is that humans are simply more malleable than chimps, or for that matter any other creature on earth. It is humans that learn best by aping and not the other way around.

*

However much we may try to refine our search for answers, it is actually difficult to get beyond the most rudimentary formulation, which ponders whether ‘human nature’ is for the most part good or bad. Rephrased, as it often is, this same inquiry generally receives one of four responses that can be summarised as follows:

i) that human nature is mostly good but corruptible;

ii) that human nature is mostly bad but can be corrected;

iii) that human nature is mostly bad but with flaws that can be ameliorated – rather than made good; or,

iv) most misanthropically, that human nature is atrocious, and irredeemably so, but that’s life.

The first is the Romanticism of Rousseau, whereas the third and fourth hinge around the cynicism of Hobbes. Whereas Hobbes had regarded the ‘state of nature’ as the ultimate threat, Rousseau implores us instead to return to a primitive state of authentic innocence. And it is these extremes of Hobbes and Rousseau that still prevail, informing the nuclear-armed policy of Mutual Assured Destruction on the one hand, and the counterculture of The New Age on the other. Curiously, both peer back distantly to Eden and reassess The Fall from different vantages too. Although deeply unreligious, Hobbes holds the more strictly Christian orthodox view. As undertaker and poet Thomas Lynch laid it out:

[T]he facts of the matter of human nature – we want, we hurt and hunger, we thirst and crave, we weep and laugh, dance and desire more and more and more. We only do these things because we die. We only die because we do these things. The fruit of the tree in the middle of Eden, being forbidden, is sexy and tempting, tasty and fatal.

The fall of Man and Free Market Capitalism, no less the doctrines of Redemptive Suffering and Supply and Demand are based on the notion that enough is never enough… A world of carnal bounty and commercial indifference, where men and women have no private parts, nor shame nor guilt nor fear of death, would never evolve into a place that Darwin and Bill Gates and the Dalai Lama could be proud of. They bit the apple and were banished from it. 58

Forever in the grip of the passions, our ‘appetites’ and ‘aversions’, these conjoined and irrepressible Hobbesian forces of attraction and repulsion continually incite us. In our desperation to escape, we flee blindly from our fears, yet remain hopeful always of entirely satisfying our desires. It’s pain and pleasure all the way: sex and death! And I imagine if you had asked Hobbes whether without the apple “we’d still be blissfully wandering about naked in paradise”, as Dudley Moore put it to Peter Cook’s Devil in the marvellous Faustian spoof Bedazzled, you’d very likely get a similar reply to the one Cook gave him: “they [Adam and Eve] were pig ignorant!” 59

However, the Genesis myth, although a short story, in fact unfolds in two very distinct acts: only the first is concerned with temptation, whereas the denouement is centred on shame. So let’s consider shame for a moment, because shame appears to be unique as an emotion, and though we habitually confuse it with guilt – since both are involved in reactions of conscience – shame has an inescapable social quality. To summarise: guilt involves what you do, while shame is intrinsically bound up with your sense of self. So guilt leads us to make apologies, a healthy response to wrongdoing, whereas you cannot apologise for being bad.


Detail from ‘The Expulsion from the Garden of Eden’ (Italian: Cacciata dei progenitori dall’Eden), a fresco by the Italian Early Renaissance artist Masaccio, ca. 1427. Based on image from Wikimedia Commons.

*

The American academic Brené Brown describes shame as “the intensely painful feeling or experience of believing that we are flawed and therefore unworthy of love and belonging” 60 and asks us to imagine how it would feel to be in a room with all the people we most love, then on walking out to overhear the worst things imaginable said about ourselves; so bad that we don’t think we’ll ever be able to walk back into the room and face everyone again.

In fact, shame is ultimately tied up with fears of being unworthy, unloveable and abandoned, which we learn to feel as infants, when isolation and rejection are actual existential threats. So it triggers instinctual responses that humans probably evolved in order to avoid being rejected and ostracised by the group, when this again involved an actual existential threat. Shame is an overwhelming feeling accompanied by many physiological sensations: blushing, the tightening of the chest, feelings of not being able to breathe, and a horrible dread that runs to the pit of the stomach. It is really no exaggeration to say that shame feels like death.

Moreover, and unlike our other emotions, shame can be a response to just about anything: our appearance; our own attention-seeking, when we get too boisterous, too over-excited, or talk too much (especially about ourselves); or when we retreat into isolation, feeling shy and avoidant; or feeling inauthentic, fake; or for being taken advantage of; or conversely being unable to drop our armour, and being judgmental and quick to anger; or just for a lack of ability, skills, or creativity; our failure to communicate properly, including being able to speak up or speak honestly; or when we are lazy, or weak, with low energy or lack of motivation, perhaps sexually; or finally – not that my list is in any way exhaustive – shame can be triggered by anxiety, nervousness, defensiveness, when we display our weakness by blushing or showing other visible signs of nervousness or shame. Note the circularity.

Strangely, we can even feel shame without recognising the symptoms, and this may again generate escalating confusion and a terrifying sense of spiralling: a fear that we won’t survive the feeling itself. In fact, shame and fear have a co-existent relationship such that we can alternate between the two, and both may leave terrible psychological scars; some parts of us becoming repressed, others forming a mask – unconscious and conscious aspects respectively (a topic I return to consider in the next chapter).

Interestingly, Jean-Paul Sartre is often paraphrased as saying “hell is other people”, which is then widely misinterpreted to mean that our relationships with others are invariably poisoned. In fact, what Sartre meant is closer to the idea that hell is the judgment of our own existence in the eyes of other people, so perhaps what he finally intended to say is “hell is our sense of rejection in the eyes of others”. If so, then he was surely right. 61

Seen in this way, the Rousseauian standpoint becomes intriguing. Is it possible that the root cause of all human depravity is finally shame? And if we could get beyond our shame, would this return to innocence throw open the gates to paradise once more?

In this chapter I have already tried to expose some of the chinks in our rather well-worn armour of Hobbesianism because, for the reasons expounded upon above, it has been collectively weighing us down. Hobbes’ adamancy that human nature is rotten to the core, with its corollary that there is little to be done about it, is actually rather difficult to refute; the measure of human cruelty vastly exceeding all real or apparent acts of generosity and kindness. But Hobbes’ account is lacking, and what it lacks in abundance is any kind of empathy. Our capacity for empathy is, Brené Brown points out, obstructed primarily by shame. Why? Because empathy can only flourish where there is vulnerability, and this is precisely what shame crushes.

So yes, we must concede that the little boy who pulls the legs off flies greatly amuses himself. There can be a thrill to malice, if of a rather shallow and sordid kind. But more happiness is frequently to be found in acts of creation than in destruction; more fulfilment in helping than hindering; and there is far more comfort in loving than in hating. Even Hobbes, though ‘twinned with fear’, deep down must have known this too.

Brené Brown has spent many decades researching shame, which she believes is an unspoken epidemic and the secret behind many forms of disruptive behaviour. An earlier TED talk on vulnerability became a viral hit. Here she explores what can happen when people confront their shame head-on:

*

On the whole, we are not very much into the essence of things these days. Essentialism is out and various forms of relativism are greatly in vogue. That goes for all things except perhaps our ‘human nature’, for which such an essence is very commonly presumed. Yet it seems to me that the closer one peers, the blurrier any picture of our human nature actually becomes; and the harder one tries to grasp its essence, the less tangible it is. In any case, each of the various philosophies that inform our modern ideas of ‘human nature’ is intrinsically tainted by prior, and in general hidden, assumptions, which arise from vestigial religious and/or political dogma.

For instance, if we take our cue from Science (most especially from Natural History and Biology) by seeking answers in the light of Darwin’s discoveries, then we automatically inherit a view of human nature sketched out by Malthus and Hobbes: Malthus, who proceeded directly from (his own version of) God at the outset, and Hobbes, who in desperately trying to circumvent the divine, finished up constructing an entire political philosophy based on a notion barely distinguishable from Augustine’s doctrine of Original Sin. Meanwhile, almost all of the histories that commonly inform our opinions about human nature are those written about, and in service of, the battle-hardened conquerors of empires.

But why suppose that there really is anything deserving the title ‘human nature’ in the first place, especially given what is most assuredly known about our odd species: that we are supremely adaptable, and very much more malleable and less instinctive than all our fellow creatures? Indeed the composite words strike me as rather curious, once I step back a little. After all, ‘human’ and ‘nature’ are not in general very comfortable bedfellows. ‘Human’ meaning ‘artificial’ and ‘nature’ meaning, well… ‘natural’… and bursting with wholesome goodness! Or else, alternatively, ‘human’ translating as humane and civilised, leaving ‘nature’ to supply synonyms for wild, primitive and untamed… and, by virtue of this, red in tooth and claw.

In short, the very term ‘human nature’ is surely an oxymoron, doubly so as we see above. The falsehood of ‘human nature’ conceals the more fascinating if unsettling truth that in so many respects humans conjure up their nature in accordance with how we believe ourselves to be, which rests in turn on what limits are set by our family, our acquaintances and the wider culture. Human nature and human culture are inextricable, giving birth to one another like the paradoxical chicken and egg. As Huxley writes:

‘Existence is prior to essence.’ Unlike most metaphysical propositions, this slogan of the existentialists can actually be verified. ‘Wolf children,’ adopted by animal mothers and brought up in animal surroundings, have the form of human beings, but are not human. The essence of humanity, it is evident, is not something we are born with; it is something we make or grow into. We learn to speak, we accumulate conceptualized knowledge and pseudo-knowledge, we imitate our elders, we build up fixed patterns of thought and feeling and behaviour, and in the process we become human, we turn into persons. 62

Alternatively, we might give a nod to Aristotle who famously declared “man is by nature a political animal”, an assessment seemingly bound up in contradictions while yet abundantly true, and which he then expounds upon saying:

“And why man is a political animal in a greater measure than any bee or any gregarious animal is clear. For nature, as we declare, does nothing without purpose; and man alone of the animals possesses speech. The mere voice, it is true, can indicate pain and pleasure, and therefore is possessed by the other animals as well (for their nature has been developed so far as to have sensations of what is painful and pleasant and to indicate those sensations to one another), but speech is designed to indicate the advantageous and the harmful, and therefore also the right and the wrong; for it is the special property of man in distinction from the other animals that he alone has perception of good and bad and right and wrong and the other moral qualities, and it is partnership in these things that makes a household and a city-state.” 63

Two millennia later and half a millennium after the Aristotelian star had finally waned, Benjamin Disraeli reflected on the latest developments in science and specifically the new theory of evolution, saying:

“The question is this— Is man an ape or an angel? My Lord, I am on the side of the angels.” 63a

To end, therefore, I propose a secular update to Pascal’s wager, which goes as follows: if, in direct contradiction to Hobbes, we trust in our ‘human nature’ and promote its more virtuous side, then we stand to gain amply if we are right to do so, and at little cost if we are not; for if it turns out we were mistaken and ‘human nature’ is indeed intrinsically rotten to our bestial cores, our lot as a species is inescapably dreadful whatever we wish to achieve. For in the long run, as new technologies supply ever more creative potential for cruelty and destruction (including self-annihilation), what chance do we have to survive at all if we are so unwilling to place just a little trust in ourselves to do a whole lot better?

Next chapter…

*

Addendum: the Malthusian population bomb scare

Thomas Malthus was a man of many talents. A student of Cambridge University, where he had excelled in English, Latin, Greek and Mathematics, he later became a Professor of History and Political Economy and a Fellow of the Royal Society. There is, however, chiefly one subject above all others that Malthus remains closely associated with, and that is the subject of demography – human populations – a rather single-minded preoccupation that during his tenure as professor is supposed to have earned him the nickname “Pop” Malthus.

Malthus’s big idea was precisely this: that whereas human population increases geometrically, food production, upon which the growing population inevitably depends, can only increase in an arithmetic fashion. He outlines his position as follows:

I think I may fairly make two postulata. First, That food is necessary to the existence of man. Secondly, That the passion between the sexes is necessary and will remain nearly in its present state. These two laws, ever since we have had any knowledge of mankind, appear to have been fixed laws of our nature, and, as we have not hitherto seen any alteration in them, we have no right to conclude that they will ever cease to be what they now are… 64

Given that populations always grow exponentially whereas food production must inevitably be arithmetically limited, Malthus concludes that the depressing, but unassailable consequence is a final limit not simply to human population but to human progress and “the perfectibility of the mass of mankind”:

This natural inequality of the two powers of population and of production in the earth, and that great law of our nature which must constantly keep their effects equal, form the great difficulty that to me appears insurmountable in the way to the perfectibility of society. All other arguments are of slight and subordinate consideration in comparison of this. I see no way by which man can escape from the weight of this law which pervades all animated nature. No fancied equality, no agrarian regulations in their utmost extent, could remove the pressure of it even for a single century. And it appears, therefore, to be decisive against the possible existence of a society, all the members of which should live in ease, happiness, and comparative leisure; and feel no anxiety about providing the means of subsistence for themselves and families. 65

It’s a truly grim message, although in fairness to Malthus, the gloom is delivered in a lively and frequently entertaining style. That said, however, Malthus was wrong. Terribly wrong.
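Before turning to the specifics, it is worth setting out the bare arithmetic of the claim. Following the illustration in the essay itself, where population unchecked doubles every twenty-five years while subsistence gains at best a fixed increment in each such period, the two progressions run:

$$P_n = 2^n:\; 1,\,2,\,4,\,8,\,16,\,32,\ldots \qquad F_n = n+1:\; 1,\,2,\,3,\,4,\,5,\,6,\ldots$$

Food per head then goes as $(n+1)/2^n$, which dwindles towards zero however generously the starting quantities are chosen. The deduction is watertight; it is the two premises, as we shall see, that fail.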

Firstly, he was wrong in terms of specifics, since he wildly over-estimated the rate of population growth 66, thereby exaggerating the number of future mouths needing to be fed and, by extension, the amount of food needed to fill them. Obviously what Malthus was lacking here were actual available statistics, and it is perhaps not surprising, therefore, that he later became one of the founder members of the Statistical Society in London 67: the first organisation in Britain dedicated to the collection and collation of national statistics. Charles Babbage, who is nowadays best remembered as the inventor of early calculating machines, known as “difference engines” – machines that helped to lead the way to modern computing – was another founder member of the group, and obviously took statistics very seriously indeed. He even once corrected the poet Alfred Tennyson in a letter as follows:

In your otherwise beautiful poem, one verse reads, ‘Every moment dies a man,/ Every moment one is born’: I need hardly point out to you that this calculation would tend to keep the sum total of the world’s population in a state of perpetual equipoise whereas it is a well-known fact that the said sum total is constantly on the increase. I would therefore take the liberty of suggesting that in the next edition of your excellent poem the erroneous calculation to which I refer should be corrected as follows: ‘Every moment dies a man / And one and a sixteenth is born.’ I may add that the exact figures are 1.167, but something must, of course, be conceded to the laws of metre. 68

It may be noted then, that such a rate of increase (presumably based on real statistics), although still exponential, is far below the presumed rates of growth in Malthus’s essay. But then Malthus’s estimate may be fairly excused; his famous essay having been first published about four decades before any such statistics were available. Malthus was, however, also more fundamentally wrong in his thesis; for such catastrophic oscillations as he envisaged, through cycles of overpopulation and famine, are not the order of our times, and less so now than even during his own times of relatively small populations. In fact, contrary to Malthus’ prophecies of doom, we have a great plenty of food to go around (lacking merely the political and economic will to distribute it fairly) 69, with official UN estimates indicating that we shall continue to have such abundance for the foreseeable future. 70

*

I can still recall when, as a sixth-former, I first heard about Malthus’ theory of population, and how it had sounded like altogether the daftest, most simplistic theory I’d ever come across – an opinion that held for at least a few months, until I heard about Abraham Maslow’s “hierarchy of needs”, which I then considered dafter and more simplistic again. In both cases, it was clear to me that supposition and conjecture were being presented as quasi-scientific fact. In Maslow’s case, with his hierarchical stacking of physical and psychological needs, it was also self-evident that no such ascending pyramid really existed anywhere outside of Maslow’s own imaginings: that you might just as well construct a dodecahedron of pleasures, or a chocolate cheesecake of motivational aspirations, as make up any kind of pyramid of human needs.

I was judging his ideas unfairly, however, and in hindsight I see that I was prejudiced by my scientific training. As a student of Physics, Chemistry and Mathematics, I’d become accustomed to rigorously grounded theories in which predictions can and must be made and tested against actual data. But Maslow’s theory is not a theory of this kind. It is inherently nonrigorous, and yet it may still be valuable in another way. As a psychologist he had diverged from the contemporary practice of expanding the field purely on the basis of neuroses and complexes, and sought instead a more humanistic approach to analysing what he thought constituted healthy-mindedness. His main concern was how people might achieve “self-actualization”. So his ‘theory’ is better understood and judged within this context, and the same goes for other nonrigorous formulations. 71

With Malthus, however, my irritation was coloured differently. His theory may have been simply an educated and carefully considered hunch, but it did at least present us with outcomes that could be scientifically reviewed. Plainly, however, all the available facts confounded his case absolutely.

After all, it had been two centuries since Malthus first conjectured on the imminence of food shortages, yet here we were, hurtling towards the end of the twentieth century, still putting too many leftovers in our bins. And though people living in the third world (as it was then called) were desperately poor and undernourished – as remains the case – this was already the consequence of our adopted modes of distribution rather than any consequence of insufficient production of food as such. Indeed, as a member of the EEC, the United Kingdom was responsible for its part in the storage of vast quantities of food and drink that would never be consumed: the enormous ‘mountains of cheese’ and the ‘lakes of milk and wine’ being such prominent features of the politico-economic landscape of my adolescence.

So where precisely did Malthus go wrong? In fact, both of his purportedly axiomatic postulates are unfounded. Regarding food production being limited to an arithmetic progression, he completely failed to factor in the staggering ingenuity of human beings. He seems curiously oblivious to how, even at the turn of the nineteenth century when his essay was written, food production was already undergoing some dramatic technological shifts, including methods of selective breeding and the advent of mechanised farming equipment. The more recent developments of artificial fertilisers and pesticides have enabled cultivation of far greater acreage, with crop yields boosted far in excess of any arithmetic restriction. With the latest “green technologies” permitting genetic manipulation, the amounts of food we are able to produce might be vastly increased again, if this is what we should choose to do – and I do not say that we should automatically resort to such radical and potentially hazardous new technologies, only that there are potential options to forestall our supposed Malthusian fate.

Meanwhile, on the other side of Malthus’s inequality, we see that his estimates of rates of population growth were wrong for different but perhaps related reasons. Again, he underestimates our adaptive capability as a species, but here the error is born out of an underlying presumption; one that brings me right back to the question of ‘human nature’.

*

Perhaps the most interesting and intriguing part of Malthus’ famous essay is not the account of his discredited formulas illustrating the mismatch between population growth and food production, but the concluding pages. Here are chapters not about geometric and arithmetic progressions, nor selected histories to convince us of the reality of our predicament, nor even the various criticisms of the progressive thinkers he is at pains to challenge – no, by far the most interesting part (in my humble opinion) is the final chapters, where he enters into discussion of his real specialism, which was theology. For Reverend Malthus was first and foremost a man of the cloth, and it turns out that his supposedly axiomatic propositions actually arose from his thoughts about the nature of God, of Man, of the Mind, and of Matter and Spirit. 72, 73

In short, Malthus argues here that God fills us with needs and wants in order to stimulate action and develop our minds; necessity being such a constant and reliable mother of invention. And Malthus draws support from the enlightenment philosophy of empiricist and humanist John Locke:

“If Locke’s idea be just, and there is great reason to think that it is, evil seems to be necessary to create exertion, and exertion seems evidently necessary to create mind.” This given, it must follow, Malthus says, that the hardships of labour required for survival are “necessary to the enjoyment and blessings of life, in order to rouse man into action, and form his mind to reason”. 74

Whilst adding further that:

The sorrows and distresses of life form another class of excitements, which seem to be necessary, by a peculiar train of impressions, to soften and humanize the heart, to awaken social sympathy, to generate all the Christian virtues, and to afford scope for the ample exertion of benevolence.

The perennial theological “problem of evil” is thus surmountable, Malthus says, if one accepts “the infinite variety of forms and operations of nature”, since “evil exists in the world not to create despair, but activity.” In other words, these things are sent to try us, or rather, because Malthus is very keen to distance himself from more traditional Christian notions of reward and punishment, “not for the trial, but for the creation and formation of mind”. Without pain and distress there would be no pricks to kick against, and thus no cause to perfect ourselves. This, at least, is Malthus’ contention.

In this he echoes a theodicy already well developed by one of the true Enlightenment geniuses, Gottfried Wilhelm Leibniz. Best remembered now as the independent discoverer of calculus, unaware of Newton’s parallel development, Leibniz also left us an astonishing intellectual legacy with published articles on almost every subject including politics, law, history and philosophy. In a collection of essays from 1710, and in making his own case for the goodness of God, it was Leibniz who first described our world as “the best of all possible worlds”. 75

Famously, Voltaire stole Leibniz’s aphorism and, by reworking it into the central motif of his marvellous satire Candide (written 1759), invested it with characteristically biting irony. In Candide’s adventures, Voltaire turns the phrase into the favourite maxim and motto of his learned companion and teacher Dr Pangloss, the Panglossian faith being an unimpeachable acceptance of divine and cosmic beneficence, maintained in spite of every horror and irrespective of all the disasters they witness and that befall them. Shipwrecks, summary executions, and even torture at the hands of the Inquisition; all is justifiable in this best of all possible worlds. For Malthus, although writing some four decades after Voltaire’s no-nonsense lampooning, an underpinning belief in a world that was indeed “the best of all possible worlds” remained central to his thesis; Malthus even declaring with Panglossian optimism that:

… we have every reason to think that there is no more evil in the world than what is absolutely necessary as one of the ingredients in the mighty process [of Life]. 76

So what does all of this mean for Malthus’s God? Well, God is mysterious and ultimately unfathomable, because “infinite power is so vast and incomprehensible an idea that the mind of man must necessarily be bewildered in the contemplation of it.” This accepted, Malthus then argues that we do have clues, however, for understanding God through objective analysis of his handiwork, by “reason[ing] from nature up to nature’s God and not presum[ing] to reason from God to nature.”

Yes, says Malthus, we might fancy “myriads and myriads of existences, all free from pain and imperfection, all eminent in goodness and wisdom, all capable of the highest enjoyments, and unnumbered as the points throughout infinite space”, but these are “crude and puerile conceptions” born of the inevitable and unassailable ignorance and bewilderment we have before God. Far better then, to:

“… turn our eyes to the book of nature, where alone we can read God as he is, [to] see a constant succession of sentient beings, rising apparently from so many specks of matter, going through a long and sometimes painful process in this world, but many of them attaining, ere the termination of it, such high qualities and powers as seem to indicate their fitness for some superior state. Ought we not then to correct our crude and puerile ideas of infinite Power from the contemplation of what we actually see existing? Can we judge of the Creator but from his creation?”

So God, at least according to Rev. Malthus, is to be understood directly through Nature – an idea bordering on the heretical. But what of the Principle of Population? How does this actually follow from the Malthusian “God of nature”? 77

Here we must remind ourselves again that what nowadays are sometimes called our instinctual drives, and what Malthus describes as “those stimulants to exertion which arise from the wants of the body”, are to Malthus but necessary evils. They are evils but with a divine purpose, and this purpose alone justifies their existence. In particular, those wants of the body which Malthus coyly refers to as “the passion between the sexes” are, in this scheme, the necessary means for the human race to perpetuate itself. With sex directly equated to procreation.

On the face of it then, Malthus must have been entirely ignorant of the sorts of sexual practices that can never issue progeny. (To rework a line from Henry Ford) sex might be any flavour you like, so long as it is vanilla! More likely, however, he dismissed any such ‘contraceptive’ options not because of ignorance but on the grounds of his deep-seated Christian morality. Rum and the lash, in moderation possibly, but sodomy… we are British!

If Malthus could be brought forward to see the western world today, what he’d find would doubtless be a tremendous shock in many ways. Most surprisingly, however, he would discover a culture where ‘the passions’ are endlessly titillated and aroused, and where “the wants of the body” are very easily gratified. Quite aside from the full-frontal culture shock, Malthus would surely be even more astonished to hear that our libidinous western societies have solved his supposedly insoluble population problem; our demographics flattening off, and our numbers in a slow but steady decline.

Malthus had argued very strongly against the poor laws, calling for their eventual abolition. He firmly believed that all kinds of direct intervention only encouraged a lack of moral restraint, which was to his mind the underlying root of all the problems, and that it would be better to let nature take care of these kinds of social diseases. Yet we can now see that one solution to his population problem has been the very thing he was fighting against: that the populations in our modern societies have stabilised precisely because of our universal social welfare and pension systems; safety nets that freed us all from total reliance upon the support of our children in old age.

We also see that as child mortality has markedly decreased, parents have little reason to raise such large families in the first instance. And that once more people – women especially – won access to a basic education, the personal freedom this affords gave them further opportunity and better reason to plan ahead and settle for smaller families. It is thanks to all of these social changes, combined with the development of the contraceptive pill, that “the passion between the sexes” has been more or less surgically detached from population growth.

Making life tougher, Malthus reasoned, would be the bluntest of tools for keeping down the numbers, especially of the lower classes. Yet if he landed on Earth today, he would discover irrefutable proof that the exact opposite is the case: where nations are poorest, populations are rising fastest. There is much that Malthus presumed to be common sense but that, in fact, turns out to be false. 78

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.

*

1 From Prince Hamlet’s monologue to Rosencrantz and Guildenstern in Hamlet Act II, Scene 2. In fuller context:

What a piece of work is a man! How noble in reason, how infinite in faculty! In form and moving how express and admirable! In action how like an angel, in apprehension how like a god! The beauty of the world. The paragon of animals. And yet, to me, what is this quintessence of dust? Man delights not me. No, nor woman neither, though by your smiling you seem to say so.

2 Quote taken from the Introduction to The Naked Ape written by Desmond Morris, published in 1967; republished in: “The Naked Ape by Desmond Morris,” LIFE, Vol. 63, Nr. 25 (22 Dec. 1967), p. 95.

3 Stanley Kubrick speaking in an interview with Eric Norden for Playboy (September 1968)

4 “It takes all the running you can do, to keep in the same place.”

5 The original script for 2001 also had an accompanying narration, which reads:

“By the year 2001, overpopulation has replaced the problem of starvation but this is ominously offset by the absolute and utter perfection of the weapon.”

“Hundreds of giant bombs had been placed in perpetual orbit above the Earth. They were capable of incinerating the entire earth’s surface from an altitude of 100 miles.”

“Matters were further complicated by the presence of twenty-seven nations in the nuclear club.”

6 From the Stanley Kubrick interview with Playboy magazine (1968). http://dpk.io/kubrick

7 From the chapter on “Generation” from Zoonomia; or the Laws of Organic Life (1794) written by Erasmus Darwin http://www.gutenberg.org/files/15707/15707-h/15707-h.htm#sect_XXXIX

8

In October 1838, that is, fifteen months after I had begun my systematic inquiry, I happened to read for amusement Malthus On Population, and being well prepared to appreciate the struggle for existence which everywhere goes on from long-continued observation of the habits of animals and plants, it at once struck me that under these circumstances favourable variations would tend to be preserved, and unfavourable ones to be destroyed. The result of this would be the formation of new species. Here, then, I had at last got a theory by which to work; but I was so anxious to avoid prejudice, that I determined not for some time to write even the briefest sketch of it.

From Charles Darwin’s autobiography (1876), pp. 34–35.

9 Bellum omnium contra omnes, a Latin phrase meaning “the war of all against all”, is the description that Thomas Hobbes gives to human existence in “the state of nature”, which he describes first in De Cive (1642) and later in Leviathan (1651). The Latin phrase occurs in De Cive:

“… ostendo primo conditionem hominum extra societatem civilem, quam conditionem appellare liceat statum naturæ, aliam non esse quam bellum omnium contra omnes; atque in eo bello jus esse omnibus in omnia.”

“I demonstrate, in the first place, that the state of men without civil society (which state we may properly call the state of nature) is nothing else but a mere war of all against all; and in that war all men have equal right unto all things.”

In chapter XIII of Leviathan, Hobbes more famously expresses the same concept in these words:

Hereby it is manifest that during the time men live without a common Power to keep them all in awe, they are in that condition which is called War; and such a war as is of every man against every man.[…] In such condition there is no place for Industry, because the fruit thereof is uncertain: and consequently no Culture of the Earth; no Navigation, nor use of the commodities that may be imported by Sea; no commodious Building; no Instruments of moving and removing such things as require much force; no Knowledge of the face of the Earth; no account of Time; no Arts; no Letters; no Society; and which is worst of all, continual Fear, and danger of violent death; And the life of man solitary, poor, nasty, brutish, and short.

10 The glee with which my old professor had jokingly dismissed Galileo was undisguised, and he was quick to add that he regarded Galileo’s reputation as greatly inflated. What other physicist, he inquired of us, is remembered only by their first name? With hindsight, I can’t help wondering what he was alluding to. It is mostly kings and saints (and the convergent category of popes) whom we find on first-name historical terms. The implication seems to be that Galileo has been canonised as our first secular saint (after Leonardo presumably). Interestingly, and in support of this contention, Galileo’s thumb and middle fingers plus a tooth and a vertebra (removed from his corpse by admirers during the 18th century) have recently been put on display as relics in the Galileo Museum in Florence.

11 Alexander Pope (1688–1744): ‘Epitaph: Intended for Sir Isaac Newton’ (1730)

12 The famous quote comes from a letter Newton sent to fellow scientist Robert Hooke, in which, about two-thirds of the way down the first page, he says “if I have seen further, it is by standing on the shoulders of giants.” It has been suggested that this remark was actually intended as a snide dig at Hooke, a rival with whom Newton was continually in dispute and who was known for being rather short in physical stature.

13 From Il Saggiatore (1623) by Galileo Galilei. In the original Italian the same passage reads:

La filosofia è scritta in questo grandissimo libro, che continuamente ci sta aperto innanzi agli occhi (io dico l’Universo), ma non si può intendere, se prima non s’impara a intender la lingua, e conoscer i caratteri ne’ quali è scritto. Egli è scritto in lingua matematica, e i caratteri son triangoli, cerchi ed altre figure geometriche, senza i quali mezzi è impossibile intenderne umanamente parola; senza questi è un aggirarsi vanamente per un oscuro labirinto

14

Hobbes and the earl of Devonshire journeyed to Italy late in 1635, remaining in Italy until the spring of 1636 when they made their way back to Paris. During this tour of Italy Hobbes met Galileo, although the dates and details of the meeting are not altogether clear. In a letter to Fulgenzio Micanzio from 1 December, 1635, Galileo reports that “I have had many visits by persons from beyond the alps in the last few days, among them an English Lord who tells me that my unfortunate Dialogue is to be translated into that language, something that can only be considered to my advantage.” The “English Lord” is almost certainly Devonshire, and the projected English translation of the Dialogue is presumably the work of Dr. Joseph Webb mentioned in Hobbes’s February, 1634 letter to Newcastle. It is therefore likely that Hobbes met Galileo in December of 1635, although Hobbes was not otherwise known to be in Florence until April of 1636. Aubrey reports that while in Florence Hobbes “contracted a friendship with the famous Galileo Galileo, whom he extremely venerated and magnified; and not only as he was a prodigious witt, but for his sweetness of nature and manners”. Legend even has it that a conversation with Galileo in 1635 or 36 inspired Hobbes to pursue the goal of presenting moral and political philosophy in a rigorously geometrical method, although the evidence here is hardly compelling.

From a paper entitled Galileo, Hobbes, and the Book of Nature by Douglas M. Jesseph, published in Perspectives on Science (2004), vol. 12, no. 2 by The Massachusetts Institute of Technology. It is footnoted with the following disclaimer:

The evidence, such as it is, comes from the eighteenth century historian of mathematics Abraham Kästner, who reported “John Albert de Soria, former teacher at the university in Pisa, assures us it is known through oral tradition that when they walked together at the grand-ducal summer palace Poggio Imperiale, Galileo gave Hobbes the first idea of bringing moral philosophy to mathematical certainty by treating it according to the geometrical method”. Schuhmann dismisses the tale as “certainly false,” basing this judgment on a variety of evidence, including the fact that Soria himself expressed skepticism about the story.


15

There be in Animals, two sorts of Motions peculiar to them: One called Vital; begun in generation, and continued without interruption through their whole life; such as are the Course of the Blood, the Pulse, the Breathing, the Concoctions, Nutrition, Excretion, &c; to which Motions there needs no help of Imagination: The other is Animal Motion, otherwise called Voluntary Motion; as to Go, to Speak, to Move any of our limbs, in such manner as is first fancied in our minds. That Sense is Motion in the organs and interior parts of man’s body, caused by the action of the things we See, Hear, &c

Quote from, Leviathan (1651), The First Part, Chapter 6, by Thomas Hobbes (with italics and punctuation as in the original but modern spelling). https://www.gutenberg.org/files/3207/3207-h/3207-h.htm#link2H_PART1

16

[A]lthough unstudied men, do not conceive any motion at all to be there, where the thing moved is invisible; or the space it is moved in, is (for the shortness of it) insensible; yet that doth not hinder, but that such Motions are. For let a space be never so little, that which is moved over a greater space, whereof that little one is part, must first be moved over that. These small beginnings of Motion, within the body of Man, before they appear in walking, speaking, striking, and other visible actions, are commonly called ENDEAVOUR.

Ibid.

17

This Endeavour, when it is toward something which causes it, is called APPETITE, or DESIRE; the latter, being the general name; and the other, oftentimes restrained to signify the Desire of Food, namely Hunger and Thirst. And when the Endeavour is fromward [i.e., away from] something, it is generally called AVERSION. These words Appetite, and Aversion we have from the Latin; and they both of them signify the motions, one of approaching, the other of retiring. […]

Of Appetites, and Aversions, some are born with men; as Appetite of food, Appetite of excretion, and exoneration, (which may also and more properly be called Aversions, from somewhat they feel in their Bodies;) and some other Appetites, not many. The rest, which are Appetites of particular things, proceed from Experience, and trial of their effects upon themselves, or other men. For of things we know not at all, or believe not to be, we can have no further Desire, than to taste and try. But Aversion we have for things, not only which we know have hurt us; but also that we do not know whether they will hurt us, or not.

Ibid.

18 Quote from, Leviathan (1651), The First Part, Chapter 8, by Thomas Hobbes (with italics and punctuation as in the original but modern spelling).

19 Ibid.

20 Ibid.

21 S. L. A. Marshall’s findings were compiled in a seminal work titled Men Against Fire (1947).

22

In the aftermath of the Battle of Gettysburg, the Confederate Army was in full retreat, forced to abandon all of its dead and most of its wounded. The Union Army and citizens of Gettysburg had an ugly cleanup task ahead of them. Along with the numerous corpses littered about the battlefield, at least 27,574 rifles (I’ve also seen 37,574 listed) were recovered. Of the recovered weapons, a staggering 24,000 were found to be loaded, either 87% or 63%, depending on which number you accept for the total number of rifles. Of the loaded rifles, 12,000 were loaded more than once and half of these (6,000 total) had been loaded between three and ten times. One poor guy had reloaded his weapon twenty-three times without firing a single shot.

From On Killing: The Psychological Cost of Learning to Kill in War and Society (1996) by Dave Grossman

23 The same passage concludes:

Another doctrine repugnant to Civil Society, is, that “Whatsoever a man does against his Conscience, is Sin;” and it dependeth on the presumption of making himself judge of Good and Evil. For a man’s Conscience, and his Judgement is the same thing; and as the Judgement, so also the Conscience may be erroneous. Therefore, though he that is subject to no Civil Law, sinneth in all he does against his Conscience, because he has no other rule to follow but his own reason; yet it is not so with him that lives in a Common-wealth; because the Law is the public Conscience, by which he hath already undertaken to be guided.

Quote from, Leviathan (1651), The Second Part, Chapter 29, by Thomas Hobbes (with italics and punctuation as in the original but modern spelling).

24 Hobbes had actually tried to found his entire philosophy on mathematics, but in characteristically contrarian fashion was also determined to prove that mathematics itself was reducible to materialistic principles. This meant rejecting an entire tradition, begun with Euclid and continuing today, which holds that the foundations of geometry lie in abstractions such as points, lines and surfaces. In response to Hobbes, John Wallis, Oxford University’s Savilian Professor of Geometry and founding member of the Royal Society, publicly engaged with the “pseudo-geometer” in a dispute that raged from 1655 until Hobbes’s death in 1679. To illustrate the problem with Hobbes’s various “proofs” of unsolved problems including squaring the circle (all of which were demonstrably incorrect), Wallis asked rhetorically: “Who ever, before you, defined a point to be a body? Who ever seriously asserted that points have any magnitude?”

You can read more about this debate in a paper published by The Royal Society titled Geometry, religion and politics: context and consequences of the Hobbes–Wallis dispute written by Douglas Jesseph, published October 10, 2018. https://doi.org/10.1098/rsnr.2018.0026

25 Quote from, Leviathan (1651), The First Part, Chapter 5, by Thomas Hobbes (with italics and punctuation as in the original but modern spelling).

26 From The Perils of Obedience (1974) by Stanley Milgram, published in Harper’s Magazine. Archived from the original on December 16, 2010. Abridged and adapted from Obedience to Authority.

27 Ibid.

28 From The Life of the Robin, Fourth Edition (1965), Chapter 15 “A Digression on Instinct” written by David Lack.

29 From Historia Vitae et Mortis by Sir Francis Bacon (‘History of Life and Death’, 1623).

30 Morphological changes such as albinism and loss of sight are common to all cave-dwelling species including invertebrates, fish and also birds. It is presumed that these changes have come about because they save energy and thus confer an evolutionary advantage, although biologists find it difficult to explain the loss of pigmentation, since there seems to be very little energy saved in this way.

31 From a Tanner Lecture on Human Values entitled Morality and the Social Instincts: Continuity with the Other Primates delivered by Frans B. M. de Waal at Princeton University on November 19–20, 2003.

The abstract begins:

The Homo homini lupus [“Man is wolf to man.”] view of our species is recognizable in an influential school of biology, founded by Thomas Henry Huxley, which holds that we are born nasty and selfish. According to this school, it is only with the greatest effort that we can hope to become moral. This view of human nature is discussed here as “Veneer Theory,” meaning that it sees morality as a thin layer barely disguising less noble tendencies. Veneer Theory is contrasted with the idea of Charles Darwin that morality is a natural outgrowth of the social instincts, hence continuous with the sociality of other animals. Veneer Theory is criticized at two levels. First, it suffers from major unanswered theoretical questions. If true, we would need to explain why humans, and humans alone, have broken with their own biology, how such a feat is at all possible, and what motivates humans all over the world to do so. The Darwinian view, in contrast, has seen a steady stream of theoretical advances since the 1960s, developed out of the theories of kin selection and reciprocal altruism, but now reaching into fairness principles, reputation building, and punishment strategies. Second, Veneer Theory remains unsupported by empirical evidence.

https://tannerlectures.utah.edu/_documents/a-to-z/d/deWaal_2005.pdf

32 Quote from a NOVA interview, “The Bonobo in All of Us”, PBS, January 1, 2007.

33 Ibid.

35 The second stanza of Wallace Stevens’ poem Thirteen Ways of Looking at a Blackbird.

36 As he explained in an interview published in the Royal Society of Biology journal The Biologist, Vol. 60(1), pp. 16–20. https://www.rsb.org.uk/biologist-interviews/richard-dawkins

37 Extracts taken from Chapter 2, pp 45-48, “Seeing Voices” by Oliver Sacks, first published 1989, Picador.

38 Aldous Huxley in the Foreword of ‘The First and Last Freedom’ by Jiddu Krishnamurti.

In his collection of essays Adonis and the Alphabet (1956), the first chapter titled “The Education of an Amphibian” begins as follows:

Every human being is an amphibian— or, to be more accurate, every human being is five or six amphibians rolled into one. Simultaneously or alternately, we inhabit many different and even incommensurable universes. To begin with, man is an embodied spirit. As such, he finds himself infesting this particular planet, while being free at the same time to explore the whole spaceless, timeless world of universal Mind. This is bad enough; but it is only the beginning of our troubles. For, besides being an embodied spirit, each of us is also a highly self-conscious and self-centred member of a sociable species. We live in and for ourselves; but at the same time we live in and, somewhat reluctantly, for the social group surrounding us. Again, we are both the products of evolution and a race of self-made men. In other words, we are simultaneously the subjects of Nature and the citizens of a strictly human republic, which may be anything from what St Paul called ‘no mean city’ to the most squalid of material and moral slums.

39 Also from the first chapter titled “The Education of an Amphibian” of Aldous Huxley’s collection of essays Adonis and the Alphabet (1956).

39a Quote taken from “Rixty Minutes”, Episode 8, Season 1, of adult cartoon Rick and Morty first broadcast by the Cartoon Network on March 17, 2014.

40 The quote is directly addressed to political philosopher and anarchist Pierre-Joseph Proudhon in Chapter 2: “The Metaphysics of Political Economy”; Part 3: “Competition and Monopoly” of Karl Marx’s The Poverty of Philosophy, a critique of the economic and philosophical doctrine of Proudhon, first published in 1847. In full the quote reads:

“M. Proudhon does not know that all history is nothing but a continuous transformation of human nature.”

https://www.marxists.org/archive/marx/works/1847/poverty-philosophy/

41 Quote taken from Episode 3 of Romer’s Egypt first broadcast on BBC TV in 1982.

42 From Christopher Columbus’s log for Friday, Saturday and Sunday October 12–14, 1492. https://www.americanjourneys.org/pdf/AJ-062.pdf

43 The following are separate entries:

“With my own eyes I saw Spaniards cut off the nose and ears of Indians, male and female, without provocation, merely because it pleased them to do it. …Likewise, I saw how they summoned the caciques and the chief rulers to come, assuring them safety, and when they peacefully came, they were taken captive and burned.”

“They laid bets as to who, with one stroke of the sword, could split a man in two or could cut off his head or spill out his entrails with a single stroke of the pike.”

“They took infants from their mothers’ breasts, snatching them by the legs and pitching them headfirst against the crags or snatched them by the arms and threw them into the rivers, roaring with laughter and saying as the babies fell into the water, ‘Boil there, you offspring of the devil!’”

“They attacked the towns and spared neither the children nor the aged nor pregnant women nor women in childbed, not only stabbing them and dismembering them but cutting them to pieces as if dealing with sheep in the slaughter house.”

“They made some low wide gallows on which the hanged victim’s feet almost touched the ground, stringing up their victims in lots of thirteen, in memory of Our Redeemer and His twelve Apostles, then set burning wood at their feet and thus burned them alive.”

From the History of the Indies (1561) by Bartolomé de las Casas.

44 Ibid.

45 As with many of the best-known quotes, the first appears to be misattributed and the second is very possibly the reworking of an utterance by Voltaire. While it is true that Napoleon is reported as once saying in conversation: “What then is, generally speaking, the truth of history? A fable agreed upon,” the phrase certainly predates him. The first quote, “History is written by the winners”, can however be traced to the pen of George Orwell in one of a series of articles published by Tribune under the title “As I Please”, in which he wrote:

During part of 1941 and 1942, when the Luftwaffe was busy in Russia, the German radio regaled its home audience with stories of devastating air raids on London. Now, we are aware that those raids did not happen. But what use would our knowledge be if the Germans conquered Britain? For the purpose of a future historian, did those raids happen, or didn’t they? The answer is: If Hitler survives, they happened, and if he falls they didn’t happen. So with innumerable other events of the past ten or twenty years. Is the Protocols of the Elders of Zion a genuine document? Did Trotsky plot with the Nazis? How many German aeroplanes were shot down in the Battle of Britain? Does Europe welcome the New Order? In no case do you get one answer which is universally accepted because it is true: in each case you get a number of totally incompatible answers, one of which is finally adopted as the result of a physical struggle. History is written by the winners. [bold emphasis added]

46 All excerpts taken from Candide and Other Tales written by Voltaire, translated by T. Smollett, revised by James Thornton, published by J. M. Dent & Sons Ltd, London, first published 1937. Incidentally, my own personal copy of this book was saved from the flames of my parents’ wood-burning stove after I discovered it hidden amongst hundreds of old textbooks and destined to become fuel for their central heating system.

47 All excerpts taken from How Much do You Know? (p. 215), published by Odhams Press Limited, Long Acre, London WC2. Date of publication unknown but definitely pre-WWII on the basis of, for example, the question “what territory did Germany lose after the World War?” (on p. 164).

48 For instance, in German, Geschichte, in Russian история, and in French histoire.

49 Quote from William Shakespeare’s The Tragedy of King Richard the Second, Act II, Scene 1, spoken by John of Gaunt.

50 In their book Trump and the Puritans (published in 2020), authors James Roberts and Martyn Whittock point to the remarkable coincidence that Donald Trump’s bid for re-election in 2020 fell almost precisely on the 400th anniversary of the landing of the Mayflower at Plymouth Rock, and argue that if he is re-elected it will be thanks not only to his strong base amongst the Christian Right but also to a more pervasive and enduring belief in Manifest Destiny, American exceptionalism, the making of the New Jerusalem and “the city on the hill”, which can be traced all the way back to the Pilgrim Fathers.

Speaking with host Afshin Rattansi on RT’s Going Underground, Martyn Whittock outlined this thesis, which offers a convincing account of why so many American Christians support Trump despite his non-religious character traits, and also why there is greater support for Israel amongst Christian evangelicals than amongst American Jews.

51 The quote is taken from Chapter 4: “Of Constitutions”; Part 2 of Thomas Paine’s Rights of Man, a defence of the French Revolution against charges made by Edmund Burke in his Reflections on the Revolution in France (1790). Rights of Man was first published in two parts in 1791 and 1792 respectively.

In fuller context, Paine writes:

Man will not be brought up with the savage idea of considering his species as his enemy, because the accident of birth gave the individuals existence in countries distinguished by different names; and as constitutions have always some relation to external as well as to domestic circumstances, the means of benefitting by every change, foreign or domestic, should be a part of every constitution. We already see an alteration in the national disposition of England and France towards each other, which, when we look back to only a few years, is itself a Revolution. Who could have foreseen, or who could have believed, that a French National Assembly would ever have been a popular toast in England, or that a friendly alliance of the two nations should become the wish of either? It shows that man, were he not corrupted by governments, is naturally the friend of man, and that human nature is not of itself vicious.

http://www.gutenberg.org/files/3742/3742-h/3742-h.htm

52 The Second Law of Thermodynamics can be stated in a variety of different ways but is probably best known as follows: “that the total entropy of any isolated macroscopic system can never decrease.” Here entropy is the precise measure of something that can be loosely described as the total microscopic disorder within the system. The Second Law has many implications. Firstly, it insists upon a direction whenever any system changes, with order increasingly giving way to disorder. This itself implies an irreversibility to events and suggests a propelling “arrow of time”. The Second Law also prohibits the possibility of any kind of perpetual motion, which by extension sets a limit to the duration of the universe as a whole, since the universe can also be considered an isolated thermodynamic system, and is therefore, as a whole, subject to the Second Law. For this reason the universe is now expected to end in a cosmic whimper, known in physics as “the heat death of the universe” – with all parts having reached a very chilly thermodynamic equilibrium. It almost seems then that the Second Law of Thermodynamics might be the physical axis about which the diabolical asymmetry of destruction over creation is strung. Just how any universe of intricate complexity could ever have formed in the first instance is mysterious enough, and though the Second Law does not prohibit all orderly formation, so long as the pockets of order are counterbalanced by regions of increasing chaos, the law does maintain that the overall tendency is always towards disorder. Form it did, of course, which perhaps implies the existence of an as yet undiscovered but profoundly forceful creative principle – something that may prove to be nothing more or less than another law of thermodynamics.
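For reference, the Second Law can be put compactly as follows – my own gloss in conventional thermodynamic notation, not anything drawn from the original text:

```latex
% Standard compact statement of the Second Law (my gloss, not the essay's):
% for an isolated system, the total entropy S can never decrease,
\Delta S_{\mathrm{isolated}} \ge 0 ,
% while for a closed system exchanging heat \delta Q with surroundings
% at absolute temperature T, the Clausius inequality applies:
\mathrm{d}S \ge \frac{\delta Q}{T} .
```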

Here is physicist Richard Feynman wondering about the physical cause of irreversibility and what it tells us about the past:

53

We are survival machines – robot vehicles blindly programmed to preserve the selfish molecules known as genes. This is a truth which still fills me with astonishment.

From The Selfish Gene by Richard Dawkins.

54 This variant on the myth, with its rather Buddhist overtones, does at least account for God’s rage and instant reaction. For according to Genesis, God thereafter says, to no-one in particular: “… the man is become as one of us [sic], to know good from evil.” Our expulsion from the Garden of Eden is not simply His punishment for our disobedience (which is, of course, the doctrine the church authorities are keen to play up), but a safeguard to protect and secure His own divine monopoly: God fearing that, left alone in paradise, we might now, as the same passage goes on to elucidate, “take also of the tree of life, and eat, and live for ever.”

Extracts taken from Genesis 3:22. The full verse is as follows: “And the Lord God said, Behold, the man is become as one of us, to know good and evil: and now lest he put forth his hand, and take also of the tree of life, and eat, and live for ever:”

55 “L’hypocrisie est un hommage que le vice rend à la vertu.” – François de La Rochefoucauld, Maximes (1665–1678), 218.

Alternative translation: “Hypocrisy is a tribute vice pays to virtue.”

56

L’homme est né libre, et partout il est dans les fers. Tel se croit le maître des autres, qui ne laisse pas d’être plus esclave qu’eux.

Translated by G. D. H. Cole (1913) as: “Man is born free; and everywhere he is in chains. One thinks himself the master of others, and still remains a greater slave than they.”

From Part I, Chapter 1 of Du contrat social ou Principes du droit politique [trans: Of The Social Contract, Or Principles of Political Right] (1762) by Jean-Jacques Rousseau, a book in which Rousseau theorised about the best way to establish a political community.

57 Translated by Samuel Moore in cooperation with Frederick Engels (1888):

The proletarians have nothing to lose but their chains. They have a world to win. Working Men of All Countries, Unite!

From Section 4, paragraph 11 of Das Manifest der Kommunistischen Partei [trans: The Communist Manifesto] (1848) by Karl Marx and Friedrich Engels.

57a This was first documented by primatologist Jane Goodall, who observed what happened after the splintering of a community of chimpanzees in Gombe Stream National Park in Tanzania. Over the next four years the adult males of the splinter group were systematically killed one by one by members of the remaining original group. Goodall was profoundly disturbed by this revelation and wrote in her memoir Through a Window: My Thirty Years with the Chimpanzees of Gombe:

For several years I struggled to come to terms with this new knowledge. Often when I woke in the night, horrific pictures sprang unbidden to my mind—Satan [one of the apes], cupping his hand below Sniff’s chin to drink the blood that welled from a great wound on his face; old Rodolf, usually so benign, standing upright to hurl a four-pound rock at Godi’s prostrate body; Jomeo tearing a strip of skin from Dé’s thigh; Figan, charging and hitting, again and again, the stricken, quivering body of Goliath, one of his childhood heroes.

58 From “Bible Studies” published in Thomas Lynch’s collection of essays titled Bodies in Motion and At Rest (2011).

59

Stanley Moon [Dudley Moore]: If it hadn’t been for you… we’d still be blissfully wandering about naked in paradise.

George Spiggott aka The Devil [Peter Cook]: You’re welcome, mate. The Garden of Eden was a boggy swamp just south of Croydon. You can see it over there.

Stanley Moon: Adam and Eve were happy enough.

The Devil: I’ll tell you why… they were pig ignorant.

From the 1967 British comedy Bedazzled, directed and produced by Stanley Donen, screenplay by Peter Cook.

Transcript is available here: https://www.scripts.com/script.php?id=bedazzled_3792&p=11

60 From an article titled “shame v. guilt” by Brené Brown, published on her own website on January 14, 2013. https://brenebrown.com/blog/2013/01/14/shame-v-guilt/

61 The quote comes from Sartre’s play No Exit [French: Huis clos], first performed in 1944. Three characters find themselves trapped and forever waiting in a mysterious room which depicts the afterlife. The famous phrase “L’enfer, c’est les autres” or “Hell is other people” refers to Sartre’s idea that to see oneself apprehended as the object of another person’s conscious awareness involves a perpetual ontological struggle.

It seems that Sartre offered his own clarification, saying:

“Hell is other people” has always been misunderstood. It has been thought that what I meant by that was that our relations with other people are always poisoned, that they are invariably hellish relations. But what I really mean is something totally different. I mean that if relations with someone else are twisted, vitiated, then that other person can only be hell. Why? Because … when we think about ourselves, when we try to know ourselves … we use the knowledge of us which other people already have. We judge ourselves with the means other people have and have given us for judging ourselves.

The quote above is from a talk that preceded a recording of the play issued in 1965. http://rickontheater.blogspot.com/2010/07/most-famous-thing-jean-paul-sartre.html

62 Quote from Aldous Huxley’s collection of essays Adonis and the Alphabet (1956), Chapter 2, titled “Knowledge and Understanding”.

63 Aristotle, Politics, Book 1, section 1253a

63a From a speech made to the Oxford Diocesan Conference (25 November 1864), quoted by William Flavelle Monypenny and George Earle Buckle in The Life of Benjamin Disraeli, Earl of Beaconsfield, Volume II: 1860–1881 (1929), p. 108.

64 From “An Essay on the Principle of Population: as it affects the future improvement of society with remarks on the speculations of Mr. Godwin, M. Condorcet, and other writers” by Thomas Robert Malthus (1798), chapter 1.

65 Ibid.

66 “Taking the population of the world at any number, a thousand millions, for instance, the human species would increase in the ratio of — 1, 2, 4, 8, 16, 32, 64, 128, 256, 512, etc. and subsistence as — 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, etc. In two centuries and a quarter, the population would be to the means of subsistence as 512 to 10: in three centuries as 4096 to 13, and in two thousand years the difference would be almost incalculable, though the produce in that time would have increased to an immense extent.” is a prediction taken from chapter 2 of “An Essay on the Principle of Population…” by T. Malthus (1798). Okay then, here’s the maths: Malthus assumes a population doubling exponentially every 25 years (each generation). Two and a quarter centuries allows 9 generations, so an increase of 2 to the power of 9, which is the 512-fold increase he correctly claims. Well, what actually happened? In Malthus’s own time, Britain conducted its first census, recording in 1801 a population of 8,308,000 (thought likely to have been an under-estimate), while the world population is estimated to have just reached around 1 billion (precisely as Malthus supposes). So, according to Malthus’s calculations, the population of Britain should now be more than 4 billion (approaching the current global population), and by the same approach the population of the world should have exploded past half a trillion! This is at the extreme upper limit of estimates for the Earth’s carrying capacity: “The estimates of the Earth’s carrying capacity range from under 1 billion to more than 1,000 billion persons. Not only is there an enormous range of values, but there is no tendency of the values to converge over time; indeed, the estimates made since 1950 exhibit greater variability than those made earlier.” from UN World Population Report 2001, p.30.
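Since these ratios are easily garbled, here is a minimal sketch in Python – my own check of the arithmetic above, not anything from Malthus or the original essay; the function name malthus_projection is purely illustrative:

```python
# A minimal check of the Malthusian arithmetic (my own sketch, not Malthus's):
# population doubles every 25-year generation (geometric growth), while
# subsistence grows by one unit per generation (arithmetic growth).

def malthus_projection(generations: int) -> tuple[int, int]:
    """Return (population ratio, subsistence ratio) after n generations."""
    return 2 ** generations, 1 + generations

print(malthus_projection(9))    # (512, 10)  -> "as 512 to 10" after 225 years
print(malthus_projection(12))   # (4096, 13) -> "as 4096 to 13" after 300 years

# Britain's 1801 census figure projected forward nine generations:
print(8_308_000 * 2 ** 9)       # 4,253,696,000 -> "more than 4 billion"

# The world's ~1 billion of 1800 on the same assumptions:
print(1_000_000_000 * 2 ** 9)   # 512,000,000,000 -> "past half a trillion"
```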

67 Now known as The Royal Statistical Society (after receiving its Royal Charter in 1887).

68 Letter sent to Tennyson in response to his poem “The Vision of Sin”, published 1842. The exact details of this letter seem to vary according to source. In another version he signs off saying, “Strictly speaking, the actual figure is so long I cannot get it into a line, but I believe the figure 1 1/16 will be sufficiently accurate for poetry.”

69

After 30 years of rapid growth in agricultural production, the world can produce enough food to provide every person with more than 2 700 Calories per day, a level which is normally sufficient to ensure that all have access to adequate food, provided distribution is not too unequal.

From the report of the FAO World Food Summit (Rome, 13–17 November 1996), entitled Food for All.

70

“[However,] the slowdown [of worldwide agricultural production] has occurred not because of shortages of land or water but rather because demand for agricultural products has also slowed. This is mainly because world population growth rates have been declining since the late 1960s, and fairly high levels of food consumption per person are now being reached in many countries, beyond which further rises will be limited.” – “This study suggests that world agricultural production can grow in line with demand, provided that the necessary national and international policies to promote agriculture are put in place. Global shortages are unlikely, but serious problems already exist at national and local levels and may worsen unless focused efforts are made.” – “Agricultural production could probably meet expected demand over the period to 2030 even without major advances in modern biotechnology.”

Extracts from the Executive Summary of the FAO summary report World agriculture: towards 2015/2030, published in 2002.

71 Maslow’s ideas have fallen by the wayside, which is a pity because his study of human need was a worthwhile project. Maslow’s reductionism is wrong, but perhaps by considering a more intricate and dynamic interconnectedness between human needs, his theory can be usefully revised. The trouble with Maslow is the insistence on hierarchy, something that other academics, especially those working in the social sciences, are inclined to mistake for a kind of verified truth. Just calling an idea ‘a theory’ doesn’t make it so, certainly not in any rigorous sense, but those not trained in the hard sciences are often inclined to treat speculative formulations as though they were fully-fledged theories. This grave and recurring error infuriates many people, myself included, and especially those who have received specialist scientific training.

72 All subsequent passages and quotations in this chapter are also taken from “An Essay on the Principle of Population: as it affects the future improvement of society with remarks on the speculations of Mr. Godwin, M. Condorcet, and other writers” by Thomas Robert Malthus (1798), chapters 18 and 19.

73 His ideas on these daunting topics are cleverly conceived, unusual if not wholly original, and tread a line that is unorthodox and close to being heretical. So it’s really in these closing chapters that Malthus is most engaging and most at ease. Here, for example, is the Malthusian take on mind and matter:

It could answer no good purpose to enter into the question whether mind be a distinct substance from matter, or only a finer form of it. The question is, perhaps, after all, a question merely of words. Mind is as essentially mind, whether formed from matter or any other substance. We know from experience that soul and body are most intimately united, and every appearance seems to indicate that they grow from infancy together… As we shall all be disposed to agree that God is the creator of mind as well as of body, and as they both seem to be forming and unfolding themselves at the same time, it cannot appear inconsistent either with reason or revelation, if it appear to be consistent with phenomena of nature, to suppose that God is constantly occupied in forming mind out of matter and that the various impressions that man receives through life is the process for that purpose. The employment is surely worthy of the highest attributes of the Deity.

Having safely negotiated the potential minefield of Cartesian dualism, Malthus now applies himself to the tricky problem of evil, and its relationship to “the wants of the body”:

The first great awakeners of the mind seem to be the wants of the body… The savage would slumber for ever under his tree unless he were roused from his torpor by the cravings of hunger or the pinchings of cold, and the exertions that he makes to avoid these evils, by procuring food, and building himself a covering, are the exercises which form and keep in motion his faculties, which otherwise would sink into listless inactivity. From all that experience has taught us concerning the structure of the human mind, if those stimulants to exertion which arise from the wants of the body were removed from the mass of mankind, we have much more reason to think that they would be sunk to the level of brutes, from a deficiency of excitements, than that they would be raised to the rank of philosophers by the possession of leisure.

74 Malthus, aware of the dangers of over-generalisation, adds a little later that:

There are undoubtedly many minds, and there ought to be many, according to the chances out of so great a mass, that, having been vivified early by a peculiar course of excitements, would not need the constant action of narrow motives to continue them in activity.” Saying later again that: “Leisure is, without doubt, highly valuable to man, but taking man as he is, the probability seems to be that in the greater number of instances it will produce evil rather than good.

75 “Essais de Théodicée sur la bonté de Dieu, la liberté de l’homme et l’origine du mal” (more simply known as Théodicée), which translates from French as “Essays of theodicy on the goodness of God, the freedom of man and the origin of evil”.

76 Malthus also offers us reasons to be cheerful and indeed grateful for our world of apparent imperfection:

Uniform, undiversified perfection could not possess the same awakening powers. When we endeavour then to contemplate the system of the universe, when we think of the stars as the suns of other systems scattered throughout infinite space, when we reflect that we do not probably see a millionth part of those bright orbs that are beaming light and life to unnumbered worlds, when our minds, unable to grasp the immeasurable conception, sink, lost and confounded, in admiration at the mighty incomprehensible power of the Creator, let us not querulously complain that all climates are not equally genial, that perpetual spring does not reign throughout the year, that all God’s creatures do not possess the same advantages, that clouds and tempests sometimes darken the natural world and vice and misery the moral world, and that all the works of the creation are not formed with equal perfection. Both reason and experience seem to indicate to us that the infinite variety of nature (and variety cannot exist without inferior parts, or apparent blemishes) is admirably adapted to further the high purpose of the creation and to produce the greatest possible quantity of good.

77

This view of the state of man on earth will not seem to be unattended with probability, if, judging from the little experience we have of the nature of mind, it shall appear upon investigation that the phenomena around us, and the various events of human life, seem peculiarly calculated to promote this great end, and especially if, upon this supposition, we can account, even to our own narrow understandings, for many of those roughnesses and inequalities in life which querulous man too frequently makes the subject of his complaint against the God of nature.

Taken from Chapter 18. Ibid.

78 There are of course modern reinventions of the Malthusian message, which still play a significant role in our current political debate. These depend on extending Malthus’s idea into considerations of resource shortages of other kinds, such as energy (and after all, food is the primary form of energy for human beings) and water. This however is an area that I wish to save for future writing.


the life lepidopteran

The following article is an Interlude between Parts I and II of a book entitled Finishing The Rat Race

All previously uploaded chapters are available (in sequence) by following the link above or from category link in the main menu, where you will also find a table of contents and a preface on why I started writing it.

*

“Once upon a time, I, Chuang Chou, dreamt I was a butterfly, fluttering hither and thither, to all intents and purposes a butterfly. I was conscious only of my happiness as a butterfly, unaware that I was Chou. Soon I awaked, and there I was, veritably myself again. Now I do not know whether I was then a man dreaming I was a butterfly, or whether I am now a butterfly, dreaming I am a man.”

— Chuang Tzu 1

*

Before proceeding further, I’d like to tell a joke:

A man walks into a doctor’s.

“Doctor, Doctor, I keep thinking I’m a moth,” the man says.

The doctor gives him a serious look. “Sorry, but I am not strictly qualified to help you,” he replies, rubbing his chin earnestly before adding after a momentary pause, “You really need to see a psychiatrist.”

“Yes,” says the man, “but your light was on.”

*

There can be no doubting that each of us acts to a considerable extent in accordance with mental processes that lie far beyond, and are often alien to, our immediate conscious awareness and understanding. For instance, in general we draw breath without the least consideration, or raise an arm, perhaps to scratch ourselves, with scarcely a thought and zero comprehension of how we actually moved our hand and fingers to accomplish the act. And this everyday fact becomes more startling once we consider how even complex movements and sophisticated patterns of behaviour seem to originate without full conscious direction or awareness.

Consider walking for instance. After admittedly painstaking practice as infants, we soon become able to walk without ever thinking to swing our legs. Likewise, if we have learnt to drive, eventually we are able to manoeuvre a large vehicle with hardly more conscious effort than we apply to walking. The same is true for most daily tasks, which are performed no less thoughtlessly and which, in spite of their intricacies, we often find boring and mundane. For instance, those who have been smokers may be able to perform the rather complicated art of rolling a cigarette without pausing from conversation. Indeed, deep contemplation will probably leave us more bewildered than anything by the mysterious coordinated manipulation of all eight fingers and opposing thumbs.

Stranger still is that our ordinary conversational speech proceeds before we have formed the fully conscious intent to utter our actual words! When I first heard this claim, it struck me as so unsettling that I automatically rejected it outright in what ought perhaps to be called a tongue-jerk reaction. (Not long afterwards I was drunk enough to stop worrying about the latent implications!) For considered dispassionately, it is self-evident that there isn’t remotely sufficient time to construct each and every utterance consciously and in advance of the act of speaking; so our vocal ejaculations (as they once were unashamedly called) are just that – they are thrown out! Still further proof is provided by instances when gestures or words emerge in direct conflict with our expressed beliefs and ideas. Those embarrassing occasions when we blurt out what we know must never be spoken we call Freudian slips (and more on Freud below).

More positively, and especially when we enter ‘the zone’, each of us is able to accomplish complex physical acts – for instance throwing, catching, or kicking a ball – and again before any conscious thought arises to do so. Those who have played a sport long enough can probably recall many joyous moments when they have marvelled not only at their own impossible spontaneity, but the accompanying accuracy, deftness, nimbleness, and on very rare occasions even of enhanced physical strength. Likewise, urges, feelings, fears and sometimes the most profound insights will suddenly spring forth into “the back of our minds”, as if from nowhere. And as a consequence, this apparent nowhere acquired a name: coming to be known as “the preconscious”, “the subconscious” and more latterly, “the unconscious”.

What this means, of course, is that “I” am not what I ordinarily think I am, but in actuality a lesser aspect of a greater being who enjoys remarkable talents and abilities beyond what are ordinarily thought “my own” since they lie outside “my” immediate grasp. In this way, we all have hidden depths that can and do give rise to astonishment, although for peculiar reasons of pride, we tend in general to feign ignorance of this everyday fact.

*

The person most popularly associated with the study of the human unconscious is Sigmund Freud, a pioneer in the field but by no means its discoverer. In fact the philosopher and all-round genius Gottfried Leibniz has a prior claim to the discovery, having suggested that our conscious awareness may be influenced by “insensible stimuli” that he called petites perceptions 1. Another giant of German philosophy, Immanuel Kant, subsequently proposed the existence of ideas of which we are not fully aware, while admitting the apparent contradiction inherent in such a conjecture:

“To have ideas, and yet not be conscious of them, — there seems to be a contradiction in that; for how can we know that we have them, if we are not conscious of them? Nevertheless, we may become aware indirectly that we have an idea, although we be not directly cognizant of the same.” 2

Nor is it the case that Freud was first in attempting any kind of formal analysis of the make-up and workings of the human psyche as an entity. Already in 1890, William James had published his own ground-breaking work Principles of Psychology, and though James was keen to explore and outline his principles for human psychology by “the description and explanation of states of consciousness”, rather than to plunge more deeply into the unknown, he remained fully aware of the potentiality of unconscious forces and made clear that any “‘explanation’ [of consciousness] must of course include the study of their causes, conditions and immediate consequences, so far as these can be ascertained.” 3

*

William James’ own story is both interesting and instructive. As a young man he had been at somewhat of a loss to decide what to do with himself. Having briefly trained as an artist, he quickly realised that he’d never be good enough and became disillusioned with the idea, declaring that “there is nothing on earth more deplorable than a bad artist”. He afterwards retrained in chemistry, enrolling at Harvard in 1861 (a few months after the outbreak of the American Civil War), but restless again, twelve months or so later, transferred to biology. Still only twenty-one, James soon felt that he was running out of options, writing in a letter to his cousin:

“I have four alternatives: Natural History, Medicine, Printing, Beggary. Much may be said in favour of each. I have named them in the ascending order of their pecuniary invitingness. After all, the great problem of life seems to be how to keep body and soul together, and I have to consider lucre. To study natural science, I know I should like, but the prospect of supporting a family on $600 a year is not one of those rosy dreams of the future with which the young are said to be haunted. Medicine would pay, and I should still be dealing with subjects which interest me – but how much drudgery and of what an unpleasant kind is there!”

Three years on, James entered the Harvard Medical School, where he quickly became disenchanted. Certain that he no longer wished to become a practicing doctor, and being more interested in psychology and natural history than medicine, a fresh opportunity arose, and he soon set sail to the Amazon in hopes of becoming a naturalist. However, the expedition didn’t work out well either. Fed up with collecting bugs and bored with the company of his fellow explorers, to cap everything, he fell quite ill. Although desperate to return home, he was obliged to continue, and slowly he regained his strength, deciding that in spite of everything it had been a worthwhile diversion; no doubt heartened too by the prospect of finally returning home.

It was 1866 when James next resumed medical studies at Harvard, although the Amazon adventure had left him physically and (very probably) psychologically weakened – a continuing sickness that forced him to break off from his studies yet again. Seeking rest and recuperation, for the next two years James sojourned in Europe, where, to judge from his own accounts, he again experienced a great deal of isolation, loneliness and boredom. Returning to America at the end of 1868 – now approaching twenty-seven years old – he picked up his studies at Harvard for the last time, successfully passing his degree to become William James M.D. in 1869.

Too weak to find work anyway, James stayed resolute in his unwillingness to become a practicing doctor. So for a prolonged period, he did nothing at all, or next to nothing. Three years passed in which, besides the occasional publication of articles and reviews, he devoted himself solely to reading books and thinking thoughts, often quite gloomy ones. Then suddenly, one day, he had a semi-miraculous revelation: a very dark revelation that made him exceedingly aware not only of his own mental fragility, but of the likely prognosis:

“Whilst in this state of philosophic pessimism and general depression of spirits about my prospects, I went one evening into the dressing room in the twilight… when suddenly there fell upon me without any warning, just as if it came out of the darkness, a horrible fear of my own existence. Simultaneously there arose in my mind the image of an epileptic patient whom I had seen in the asylum, a black-haired youth with greenish skin, entirely idiotic, who used to sit all day on one of the benches, or rather shelves, against the wall, with his knees drawn up against his chin, and the coarse gray undershirt, which was his only garment, drawn over them, inclosing his entire figure. He sat there like a sort of sculptured Egyptian cat or Peruvian mummy, moving nothing but his black eyes and looking absolutely non-human. This image and my fear entered into a species of combination with each other. That shape am I, I felt, potentially. Nothing that I possess can defend me against that fate, if the hour for it should strike for me as it struck for him. There was such a horror of him, and such a perception of my own merely momentary discrepancy from him, that it was as if something hitherto solid within my breast gave way entirely, and I became a mass of quivering fear. After this the universe was changed for me altogether. I awoke morning after morning with a horrible dread at the pit of my stomach, and with a sense of the insecurity of life that I never knew before, and that I have never felt since. It was like a revelation; and although the immediate feelings passed away, the experience has made me sympathetic with the morbid feelings of others ever since.” 4

Having suffered what today would very likely be called ‘a nervous breakdown’, James was forced to reflect on the current theories of the mind. Previously, he had accepted the materialist ‘automaton theory’ – that our ability to act upon the world depends not upon conscious states as such, but upon the brain-states that underpin and produce them – but now he felt that if true this meant he was personally trapped forever in a depression that could only be cured by the administering of some kind of physical remedy. However, no such remedy was obtainable, and so he was forced instead to tackle his disorder by means of further introspection and self-analysis.

James read more and thought more since there was nothing else he could do. Three more desperately unhappy years would pass before he had sufficiently recuperated to rejoin the ordinary world, accepting an offer to become lecturer in physiology at Harvard. But as luck would have it, teaching suited James. He enjoyed the subject of physiology itself, and found the activity of teaching “very interesting and stimulating”. James had, for once, landed on his feet, and his fortunes were also beginning to improve in other ways.

Enjoying the benefits of a steady income for the first time in his life, he was soon to meet Alice Gibbons, the future “Mrs W.J.” They married two years later in 1878. She was a perfect companion – intelligent, perceptive, encouraging, and perhaps most importantly for James, an organising force in his life. He had also just been offered a publishing contract to write a book on his main specialism, which was by now – and in spite of such diversity of training – most definitely psychology. With everything now in place, James set to work on what would be his magnum opus. Wasting absolutely no time whatsoever, he drafted the opening chapters while still on honeymoon.

“What is this mythological and poetical talk about psychology and Psyche and keeping back a manuscript composed during honeymoon?” he wrote in jest to the taunts of a friend, “The only psyche now recognized by science is a decapitated frog whose writhings express deeper truths than your weak-minded poets ever dreamed. She (not Psyche but the bride) loves all these doctrines which are quite novel to her mind, hitherto accustomed to all sorts of mysticisms and superstitions. She swears entirely by reflex action now, and believes in universal Nothwendigkeit. [determinism]” 5

It would take James more than a decade to complete what quickly became the definitive university textbook on the subject, ample time for such ingrained materialist leanings to have softened. For the most part sticking to what was directly and consciously known to him, his attempts to dissect the psyche involved much painstaking introspection of what he famously came to describe as his (and our) “stream of consciousness”. Such close analysis of the subjective experience of consciousness itself had suggested to James the need to distinguish between “the Me and the I” as separate component parts of what in completeness he called “the self”. 6 In one way or another, this division of self into selves, whether these be consciously apprehensible or not, has remained a theoretical basis of all later methods of psychoanalysis.

There is a joke that Henry James was a philosopher who wrote novels, whereas his brother William was a novelist who wrote philosophy. But this does WJ a disservice. James’s philosophy, known as pragmatism, was a later diversion. Unlike his writings on psychology, which became the standard academic texts as well as popular best-sellers (and what better tribute to James’s fluid prose), his ideas on pragmatism were rather poorly received (they have gained more favour over time). But then James was a lesser expert in philosophy, a situation not helped by his distaste for logical reasoning; he is better remembered for his writings on psychology, a subject in which he excelled. Freud’s claim to originality, by comparison, is nothing like as foundational.

James was at the vanguard during the period when psychology irreparably pulled apart from the grip philosophy had held on it (which explains why James was notionally Professor of Philosophy at the time he was writing), and was grafted back to form a subdiscipline of biology. For this reason, and regardless of the fact that James remained highly critical of the developing field of experimental psychology – as he was too of the deductive reasoners on both sides of the English Channel: the British Empiricists Locke and Hume, and the continental giants Leibniz, Kant and Hegel – to some of his contemporaries James’s view appeared all too dangerously materialistic. If only they could have seen how areas of psychology were to so ruinously develop, they would have appreciated that James was, as always, a moderate.

*

While James remained an academic throughout his whole life, Freud, though he briefly studied zoology at the University of Vienna – with one month spent unsuccessfully searching for the gonads of the male eel 7 – and then spent another spell doing neurology, decided to return to medicine and open his own practice. He had also received expert training in the new-fangled techniques of hypnosis.

‘Hypnosis’ comes from the Greek hupnos and means, in effect, “artificial sleep”. To induce hypnosis, the patient’s conscious mind needs to be distracted briefly, and achieving this opens up regions of the mind beyond the usual conscious states. The terms “sub-conscious” and “unconscious” were already in circulation prior to the theories of Freud or James. And whether named or not, mysterious evidence of the unconscious had always been known. Dreams, after all, though we consciously experience them, are neither consciously conceived nor willed. They just pop out from nowhere – or from “the unconscious”.

From his clinical experiences, Freud soon discovered what he believed to be better routes to the unconscious than hypnosis. For instance, he found that it was just as effective to listen to his patients, or if their conscious mind was unwilling to give up some of its defences – as it commonly was – then to encourage their free association of words and ideas. He also looked for unconscious connections within his patients’ dreams, gradually uncovering, what he came to believe were the deeply repressed animalistic drives that govern the patient’s fears, attitudes and behaviour. Having found the unconscious root to their problems, the patient could finally begin to grapple with these repressed issues at an increasingly conscious level. It was a technique that apparently worked, with many of Freud’s patients recovering from the worst effects of their neuroses and hysteria, and so “the talking cure” became a lasting part of Freud’s legacy. You lay on the couch, and just out of sight, Freud listened and interpreted.

But Freud also left a bigger mark, by helping to shape the way we see ourselves. The types of unconscious repression he discovered in his own patients, he believed were universally present, and by drawing directly on his experiences as a doctor, he slowly excavated, as he found it, the entire human unconscious piece by piece. Two of these aspects he labelled the ‘id’ and the ‘superego’: the one a seat of primal desires, the other a chastising moral guide – reminiscent of the squabbling devil-angel duo that pop up in cartoons, jostling for attention on opposite shoulders of the character whenever he’s plunged into a moral quandary. 8

In a reboot of philosopher Arthur Schopenhauer’s concept of blind and insatiable ‘will’, Freud proposed the existence of the libido: a primary, sexual drive that ceaselessly operates beneath our conscious awareness, prompting desires for pleasure and avoidance of pain irrespective of consequence and regardless of whether these desires conflict with ordinary social conventions. In concert with all of this, Freud discerned a natural process of psychological development 9 and came to believe that whenever this development is arrested or, more generally, whenever normal appetites are consciously repressed, then lurking deep within the unconscious, such repressed but instinctual desires will inevitably and automatically resurface in more morbid forms. This, he determined, was the common root cause of all his patients’ various symptoms and illnesses.

Had Freud stopped there, his contribution to psychology would have been fully commendable, for there is tremendous insight in these ideas. He says too much no doubt (especially when it comes to the specifics of human development), but he also says something that needed to be said very urgently: that if you force people to behave against their natures you will make them sick. So it seems a pity that Freud carried some of the ideas a little too far.

Let’s take the ‘Oedipus complex’, which, of the many Freudian features of our supposed psychological nether regions, is without doubt the one of greatest notoriety. The myth of Oedipus is enthralling; its eponymous hero compelled to deal with fate, misfortune and prophecy. 10 Freud finds in this tale a revelation of deep and universal unconscious repression, and though plausible and intriguing, his interpretation rather narrows the tale’s far grander scope:

“[Oedipus’s] destiny moves us only because it might have been ours – because the Oracle laid the same curse upon us before our birth as upon him. It is the fate of all of us, perhaps, to direct our first sexual impulse towards our mother and our first hatred and our first murderous wish against our father. Our dreams convince us that this is so.”11

Freud generally studied those with minor psychological problems (he did not deal with cases of psychosis), determining on the basis of an unhappy few what he presumed true for healthier individuals too – perhaps a failing of all psychoanalytic theories. For though it may seem odd that he came to believe in the universality of the Oedipus complex, who can doubt that his clients suffered from something like it? Who can doubt that Freud himself harboured the same dark desires? Perhaps he also felt a ‘castration anxiety’ as a result of the Oedipal rivalry he’d had with his own father. Maybe he actually experienced ‘penis envy’ – if not of the same intensity he said he detected in his female patients, then of a compensatory masculine kind! After all, such unconscious ‘transference’ of attitudes and feelings from one person to another – from patient onto doctor, or vice versa in this relevant example – is another concept that Freud was first to identify and label.

*

Given the strait-laced age in which Freud fleshed out his ideas, the swiftness with which these theories received widespread acceptance and acclaim seems surprising, although there are surely two good reasons why Freudianism took hold. The first is straightforward: society had been very badly in need of a dose of Freud, or something very like Freud. After such excessive prudishness, the pendulum was bound to swing the other way. But arguably the more important reason – indeed the reason his theories have remained influential – is that Freud picked up the baton directly from where Darwin left off. By restricting his explanations to biological instincts and drives, Freudianism wears the mantle of scientific legitimacy, and this was a vital determining factor in securing its prominent position within the modern epistemological canon.

Following his precedent, students of Freud, most notably Carl Jung and Alfred Adler, also drew on clinical experiences with their own patients, but gradually came to the conclusion, for different reasons, that Freud’s approach was too reductionist, and that there is considerably more to a patient’s mental well-being than healthy appetites and desires, and thus more to the psychological underworld than solely matters of sex and death.

Where Freud was a materialist and an atheist, Jung went on to incorporate aspects of the spiritual into his extended theory of the unconscious, though he remained respectful to biology and keen to anchor his own theories upon an evolutionary bedrock. Jung nevertheless speculates following a philosophical tradition that owes much to Immanuel Kant, while also drawing heavily on personal experience, and comes to posit the existence of psychical structures he calls ‘archetypes’ operating again at the deepest levels within a collective unconscious; a shared characteristic due to our common ancestry.

Thus he envisions ‘the ego’ – the aspect of our psyche we identify as “I” – as existing in relation to an unknown and finally unknowable sea inhabited by autonomous entities which have their own life. Jung actually suggests that Freud’s Oedipus complex is just one of these archetypes, while he finds himself drawn by the bigger fish of the unconscious beginning with ‘The Shadow’ – what is hidden and rejected by the ego – and what he determines are the communicating figures of ‘Animus/Anima’ (or simply ‘The Syzygy’) – a compensatory masculine/feminine unconscious presence within, respectively, the female and male psyche – that prepare us for incremental and never-ending revelations of our all-encompassing ‘Self’.

This lifelong psychical development, or ‘individuation’, was seen by Jung as an inherently religious quest and he is unapologetic in proclaiming so; the religious impulse being a product too of human evolutionary development along with opposable thumbs and upright posture. More than a mere vestigial hangover, religion is, Jung says, fundamental to the deep nature of our species.

Unlike Freud, Jung was also invested in understanding how the human psyche varies greatly from person to person, and to these ends introduced new ideas about character types, adding ‘introvert’ and ‘extrovert’ to the psychological lexicon to draw a division between individuals characterised either by primarily subjective or objective orientations to life – an introvert himself, Jung was able to observe such a clear distinction. Meanwhile, greatly influenced by Friedrich Nietzsche’s “will to power”, Adler switched attention to issues of social identity and specifically to why people felt – in very many cases quite irrationally – inferior or superior amongst their peers. These efforts culminated in the development of his theory of the ‘inferiority complex’ – which might also be thought of as an aspect of the Jungian ‘Shadow’.

These different schools of psychoanalysis are not irreconcilable. They are indeed rather complementary in many ways: Freud tackling the animal craving and want of pleasure; Jung looking for expression above and beyond what William Blake once referred to as “this vegetable world”; and Adler delving most directly into the mud of human relations, the pervasive urge to dominate and/or be submissive, and the consequences of personal trauma associated with interpersonal and societal inequalities.

Freud presumes that since we are biological products of Darwinian evolution, our minds have been evolutionarily pre-programmed. Turning the same inquiry outward, Jung goes in search of common symbolic threads within mythological and folkloric traditions, enlisting these as evidence for the psychological archetypes buried deep within us all. And though Jung held no orthodox religious views of his own, he felt comfortable drawing upon religious (including overtly Christian) symbolism. In one of his most contemplative passages, he wrote:

Perhaps this sounds very simple, but simple things are always the most difficult. In actual life it requires the greatest art to be simple, and so acceptance of oneself is the essence of the moral problem and the acid test of one’s whole outlook on life. That I feed the beggar, that I forgive an insult, that I love my enemy in the name of Christ—all these are undoubtedly great virtues. What I do unto the least of my brethren, that I do unto Christ.

But what if I should discover that the least amongst them all, the poorest of all beggars, the most impudent of all offenders, yea the very fiend himself—that these are within me, and that I myself stand in need of the alms of my own kindness, that I myself am the enemy who must be loved—what then? Then, as a rule, the whole truth of Christianity is reversed: there is then no more talk of love and long-suffering; we say to the brother within us “Raca,” and condemn and rage against ourselves. We hide him from the world, we deny ever having met this least among the lowly in ourselves, and had it been God himself who drew near to us in this despicable form, we should have denied him a thousand times before a single cock had crowed. 12

Of course, “the very fiend himself” is the Jungian ‘Shadow’, the contents of which, without recognition and acceptance, inevitably remain repressed, causing these unapproachable and rejected aspects of our own psyche to be projected out on to the world. ‘Shadow projection’ onto others fills the world with enemies of our own imagining; and this, Jung believed, was the root of nearly all evil. Alternatively, by taking Jung’s advice and accepting “that I myself am the enemy who must be loved”, we come back to ourselves in wholeness. It is only then that the omnipresent threat of the Other diminishes, as the veil of illusion forever separating the ego and reality is thinned. And Jung’s psychological reunification also grants access to previously concealed strengths (the parts of the unconscious discussed at the top), further enabling us to reach our fullest potential. 13

Today there are millions doing “shadow work” as it is now popularly known: self-help exercises often combined with traditional practices of yoga, meditation or the ritual use of entheogens: so here is a new meeting place – a modern mash-up – of religion and psychotherapy. Quietly and individually, a shapeless movement has arisen almost spontaneously as a reaction to the peculiar rigours of western civilisation. Will it change the world? For better or worse, it already has.

Alan Watts, who is best known for his Western interpretations of Eastern spiritual traditions, in particular Zen Buddhism and Daoism, here reads this same influential passage from one of Jung’s lectures, in which Jung speaks of ending “the inner civil war”:

*

Now what about my joke at the top? What’s that all about? Indeed, and in all seriousness, what makes it a joke at all? Well, not wishing to delve deeply into theories of comedy, there is one structure that arises repeatedly and almost universally: the punch line of every joke relies on some kind of unexpected twist on the set-up.

To illustrate the point, let’s turn to the most hackneyed joke of all: “Why did the chicken cross the road?” Here we find an inherent ambiguity lying within the use of the word ‘why’, and this is what sets up the twist. However, in the case of the joke about the psychiatrist and the man who thinks he’s a moth, the site of ambiguity isn’t so obvious. Here, I think, the humour comes down to alternative and finally conflicting notions of ‘belief’.

A brief digression then: What is belief? To offer a salient example, when someone tells you “I believe in God”, what are they intending to communicate? No less importantly, what would you take them to mean? Likewise, atheists will very often say “I don’t believe in anything” – so again, what are they (literally) trying to convey? And what would a listener take them to mean? In all these instances the same word is used to describe similar but distinct attitudinal relationships to reality, and it is all too easy to presume that everyone is using the word in precisely the same way. So first we must acknowledge that the word ‘belief’ actually carries two quite distinct meanings.

According to the first definition, it is “a mental conviction of the truth of an idea or some aspect of reality”. Belief in UFOs fits this criterion, as does a belief in gravity and that the sun will rise again tomorrow. How about belief in God? When late in life Jung was asked if he believed in God, he replied straightforwardly “I know”. 14 Others reply with the same degree of conviction if asked about angels, fairies, spirit guides, ghosts or the power of healing and crystals. As a physicist, I believe in the existence of atoms, electrons and quarks – although I’ve never “seen one”, like Jung I know!

So belief in this sense is more often than not grounded in a person’s direct experience(s), which obviously does nothing to validate the objective truth of their belief. He saw a ghost. She was healed by the touch of a holy man. We ran experiments to measure the charge on an electron. Again, in this sense I have never personally known anyone who did not believe in the physical reality of a world of solid objects – for who doesn’t believe in the existence of tables and chairs? In this important sense everyone has many convictions about the truth of reality, and we surely all believe in something – even the most hardline of atheists!

But there is also a second kind of belief: “of an idea that is believed to be true or valid without positive knowledge.” The emphasis here is on the lack of knowledge, or indeed of direct experience. So this kind of belief involves an effort of will on the part of the believer. In many ways, this is to believe in make-believe, or we might just say “to make-believe”; to pretend or wish that something is real: the suspension of disbelief. I believe in unicorns…

As a child, I found all religion utterly mystifying, since what was self-evidently make-believe – for instance, a “holy ghost” and the virgin birth! – would, for reasons I was unable to fathom, be held by others as sacrosanct. Based on my casual encounters with Christians, it also seemed evident that the harder you tried to make-believe in this maddening mystification of being, the better a person it made you! So here’s the point: when someone tells you they believe in God, is this all they actually mean? That they are trying with tremendous exertion, although little conviction, to make-believe in impossibility?

Indeed, is this striving alone mistaken not only as virtuous but as actual believing in the first sense? Yes, quite possibly – and not only for religious types. Alternatively, it may be that someone truly believes in God – or whatever synonym they choose to approximate to ‘cosmic higher consciousness’ – with the same conviction that all physicists believe in gravity and atoms. They may come to know ‘God’, as Jung did.

Now back to the joke, and apologies for killing it: the man complains that he feels like a moth, and this is so silly that we automatically presume his condition is entirely one of make-believe. But then comes the twist, when we learn that his actions correspond to his belief, which means, of course, that he has true belief of the first kind. Finally, here’s my hunch for why we find this funny: it spontaneously reminds us of how true beliefs – rather than make-believe – both inform reality as we perceive it and fundamentally direct our behaviour. Yet we are always in the process of forgetting altogether that this is how we live too, until abruptly the joke reminds us again – and in our moment of recollecting, spontaneously we laugh.

Which also raises a question: To what extent do beliefs of the second ‘make-believe’ kind determine our behaviour too? Especially when the twin definitions show just how easy it can be to get confused over beliefs. Because as Kurt Vonnegut wrote in the introduction to his cautionary novel Mother Night: “This is the only story of mine whose moral I know”, continuing: “We are what we pretend to be, so we must be careful about what we pretend to be.” 15

*

I would like to return now to an idea I earlier disparaged, Dawkins’s concept of ‘memes’: ideas, stories, and other cultural fragments, the development and transmission of which can be considered similar to the mutation and survival of genes. In evoking this concept of memes, Dawkins had hoped to wrest human behaviour apart from the rest of biology in order to present an account of how it came to be that our species alone is capable of surpassing the hardwired instructions encoded in our genes. For Dawkins this entailed some fleeting speculation upon the origins of human culture, set out in the final pages of his popular science book, The Selfish Gene. Others later picked up on his idea and reworked it into a pseudo-scientific discipline known as memetics; something I have already criticised.

In fact, the notion of some kind of evolutionary force actively driving human culture occurred to authors before Dawkins. In The Human Situation, for example, Aldous Huxley outlined his own thoughts on the matter, while already making the significant point that such kinds of “social heredity” must be along Lamarckian rather than Darwinian lines:

“While it is clear that the Lamarckian conception of the inheritance of acquired characteristics is completely unacceptable, and untrue biologically, it is perfectly true on the social, psychological and linguistic level: language does provide us means for taking advantage of the fruits of past experience. There is such a thing as social heredity. The acquisitions of our ancestors are handed down to us through written and spoken language, and we do therefore enjoy the possibility of inheriting acquired characteristics, not through germ plasm but through tradition.”

Like Dawkins, Huxley recognised that culture was the singular feature distinguishing our species from others. Culture on top of nature, dictated by education, religious upbringing, class status, and so forth, establishes the social paradigms according to which individuals generally behave. However, in Huxley’s version, as in Dawkins’s, this is only metaphorically an evolutionary process, and both evidently regard the process of cultural development as most similar to evolution in one key respect: that it is haphazard.

Indeed, Dawkins and Huxley are similarly keen to stress that human culture is therefore a powerful but ultimately ambiguous force that brings about good and ill alike. As Huxley continues:

“Unfortunately, tradition can hand on bad as well as good items. It can hand on prejudices and superstitions just as effectively as it can hand on science and decent ethical codes. Here again we see the strange ambivalence of this extraordinary gift.” 16

We might also carry these ideas a little further by adding a very important determinant of individual human behaviour which such notions of ‘memetics’ have tended to overlook. For memes are basically ideas, and ideas are, by definition, a product and manifestation of conscious thought and transmission; whereas people, as I have discussed above, often behave in ways that conflict with their conscious beliefs and desires, which means that to some extent we act according to mental processes that are beyond or even alien to our immediate understanding.

Acknowledging the influence of the unconscious on our thoughts and behaviours, my contention here is straightforward enough and, I think, hard to dispute: that just as our conscious minds are moulded and differentiated by local customs and conventions, our unconscious minds are presumably likewise formed and diversified. To offer a more concrete example, the Chinese unconscious, shaped and informed by almost three millennia of Daoism, Buddhism and Confucianism, is likely to be markedly different from the unconscious mind of any of us raised within the European tradition. Besides the variations due to religio-philosophical upbringing, divergence is likely to be further compounded by the wide disparities in our languages, with dissimilarities in all elements from vocabulary, syntax and morphology down to the use of characters rather than letters.

Native tongue (or mother tongue) is a very direct and primary filter that not only channels what we are able to articulate, but governs what we are able to fully conceptualise or even to think at all. 17 It is perfectly conceivable therefore that anyone who learned to communicate first in Mandarin or Cantonese will be unconsciously differentiated from someone who learnt to speak English, Spanish or Arabic instead. 18 Indeed, to a lesser degree perhaps, all who speak English as a first language may have an alternate, if more subtly differentiated unconscious relationship to the world, from those whose mother tongue is say French or German. 19

So now I come back to the idea of memes in an attempt to resurrect it in an altered form. Like Dawkins’s original proposal, my idea is not rigorous or scientific; it’s another hunch: a way of referencing perhaps slight but characteristic differences in the collective unconscious between nations, tribes and also classes of society. Differences that then manifest perhaps as neuroses and complexes which are entirely planted within specific cultural identities – a British complex, for instance (and certainly we talk of having “an island mentality”). We might say therefore that alongside the transmission of memes, we also need to include the transmission of ‘dremes’ – cultural fragments from our direct social environment that are unconsciously given and received.

*

If this is accepted, then my further contention is that one such dreme has become predominant all around the world, and here I am alluding to what might be christened the ‘American Dreme’. And no, not the “American Dream”, which is different. The American Dream is in fact an excellent example of what Dawkins labelled a meme: a cultural notion that encapsulates a collection of ideas about how life can and ought to be. It says that life should be better, richer and fuller for everyone. Indeed, it is written indelibly into the Declaration of Independence in the wonderful phrase: “Life, Liberty and the pursuit of Happiness.” The American Dream is inspiring and has no doubt been a tremendous liberation for many; engendering technological progress and motivating millions with hopes that anyone living in “The Land of Opportunity” “can make it” “from rags to riches” – all subordinate memes encapsulating different aspects of the fuller American Dream.

E pluribus unum – “Out of many, one” – is the motto inscribed on the scroll held so firmly in the beak of the bald eagle on the Seal of the United States. 20 Again, it is another sub-meme at the heart of the American Dream meme: an emblematic call for an unbound union between the individual and the collective; inspiring a loose harmony poetically compared to the relationship of flowers in a bouquet – thus, not a melting-pot, but a richer mosaic that maintains the original diversity.

Underlying this American Dream is a related sub-meme that cherishes “rugged individualism”: the aspiration of individuals, not always pulling together, nor necessarily in one direction, but constantly striving upwards: pulling themselves up by their own bootstraps! Why? Because according to the dream at least, if you try hard enough, then you must succeed. And though this figurative pulling yourself up by your own bootstraps is a physical impossibility that contravenes Newton’s Laws, even this does not detract from the idea. Believers in the American Dream apparently notice no contradiction, despite the fantastical image of their central metaphor. The dream is buoyed so high on hope, even when deep down most know it’s actually a fairy tale.

So finally there is a desperation and a sickliness about the American Dream. A harsh reality in which “The Land of Opportunity” turns out to be a steep-sided pyramid spanned by labyrinthine avenues that mostly run to dead-ends. A promised land, but one riven by chasms as vast as the Grand Canyon; disparities that grew out of historical failures: insurmountable gulfs in wealth and real opportunity across a population always beset by class and racial inequalities. Indeed, the underclass of modern America is no less stuck within societal ruts than the underclass of the least developed regions on earth, and in relative terms many are worse off. 21 “It’s called the American Dream”, said the late, great satirist George Carlin, “because you have to be asleep to believe it”.

In short, to keep dreaming the American Dream demands an unresting commitment. Its most fervent acolytes live in a perpetually suspended state of ignorance or outright denial; denial of the everyday miseries and cruelties that ordinary Americans suffer: the ‘American Reality’.

Graphic from page 56 of Jean Kilbourne’s book Can’t Buy My Love: How Advertising Changes the Way We Think and Feel (originally published in hardcover in 1999 as Deadly Persuasion: Why Women and Girls Must Fight the Addictive Power of Advertising). It was an ad for a German marketing firm, contained within a decades-old issue of the trade journal Advertising Age:


But just suppose for a moment that the American Dream actually did come true. That America somehow escaped from its lingering malaise and blossomed into the land of real freedom and opportunity for all it always promised to be. Yet still an insurmountable problem remains. For as with every ascent, the higher you reach, the more precarious your position becomes: as apes we have never entirely forgotten how branches are thinner and fewer at the top of the tree.

Moreover, built into the American Dream is its emphasis on material enrichment: to rise towards the heavens therefore means riding up and up, always on a mountain of stuff. And as you rise, others must, in relative terms, fall. Not necessarily because there isn’t enough stuff to go around, but because success depends upon holding ownership of the greatest share. Which means that even as the American Reality draws closer to the American Dream (and it could hardly get much further away), creating optimal social mobility and realisable opportunities for all, this best of all circumstances would still see the rise of some at the expense of others, cultivating anxious winners alongside a disadvantaged underclass for whom the relative material gain of the winners comes at the cost of bearing the stigma of comparative failure.

Why am I not nearer the top of the tree? In the greatest land on earth, why do I remain subservient to the gilded elites? Worries that nowadays plague the insomniac hours of many a hopeful loser; of those who ended up, to a large extent by accidental circumstance, in the all-too-fixed trailer parks of “The Land of the Free” (yet another sub-meme – ironically attached to the country with the highest incarceration rate on earth).

But worse, there is an inevitable shadow cast by the American Dream: a growing spectre of alienation and narcissism that arises from such excessive emphasis on individual achievement: feelings of inferiority for those who missed the boat, and of superiority for those who caught the gravy train. Manipulation is celebrated. Machiavellianism, narcissism and psychopathy come to reign. This shadow is part of what we might call the ‘American Dreme’; an unconscious offspring that contains within it a truly abysmal contrast to the American Dream which bore it. A dreme that, carried upon the coat-tails of the Dream, was spread far and wide by Hollywood and by Disney, radiated out in radio and television transmissions, and in consequence is now becoming the ‘Global Dreme’.

Being unconscious of it, however, we are mostly unaware of any affliction whatsoever; the dreme being insidious, and thus very much more dangerous than the meme. We might even mistake it for something else – the dreme having become such a pandemic, we might easily misdiagnose it as a normal part of ‘human nature’.

*

Here is Chris Hedges again with his own analysis of modern day consumerism, totalitarian corporate power and living in a culture dominated by pervasive illusion:

“Working for the American Dream”, first broadcast by the BBC in July 2018 and embedded below, is American comedian Rich Hall’s affectionate though characteristically sardonic portrait of the nation’s foundational and persistent myth:

*

And the joke was hilarious, wasn’t it? No? You didn’t like it…? Well, if beauty is in the eye of the beholder, comedy surely lies in the marrow of the funny bone! Which brings me to ask why there is comedy at all. More broadly, why is there laughter – surely the most curious human reflex of all – or its very closely related reflex cousin, crying? In fact, the emission of tears from the lacrimal glands other than in response to irritation of our ocular structures, and purely for reasons of joy or sorrow, is a very nearly uniquely human secretomotor phenomenon. (Excuse my Latin!) 22

The jury is still out on the evolutionary function of laughing and crying, but when considered in strictly Darwinian terms (as current Science insists), it is hard to fathom why these dangerously debilitating and potentially life-threatening responses ever developed in any species. It is acknowledged indeed that a handful of unlucky (perhaps lucky?) people have literally died from laughter. So why do we laugh? Why do we love laughter, whether ours or others’, so much? Your guess is as good as mine, and, more importantly, as good as Darwin’s:

Many curious discussions have been written on the causes of laughter with grown-up persons. The subject is extremely complex. Something incongruous or unaccountable, exciting surprise and some sense of superiority in the laugher, who must be in a happy frame of mind, seems to be the commonest cause. 23

Less generously, Thomas Hobbes, who explained all human behaviour in terms of gaining social advantage, wrote that:

Joy, arising from imagination of a man’s own power and ability, is that exultation of the mind which is called glorying… Sudden Glory, is the passion which maketh those grimaces called LAUGHTER; and is caused either by some sudden act of their own, that pleaseth them; or by the apprehension of some deformed thing in another, by comparison whereof they suddenly applaud themselves. 24

And indeed, it is true that a great deal of laughter comes at the expense of some butt of our joking; however, not all mockery involves an injured party, and there is a great deal more to humour and laughter than merely ridicule and contempt. So Hobbes’ account is at best a very desiccated postulation of why humans laugh, let alone of what constitutes joy.

Indeed, Hobbes’ reductionism is evidently mistaken and misinformed not only by his deep-seated misanthropy, but also by a seeming lack of common insight which leads one to suspect that when it came to sharing any jokes, he just didn’t get it. But precisely what didn’t he get?

Well, apparently he didn’t get how laughter can be a straightforward expression of joie de vivre. Too French, I imagine! Or that apprehending anything momentarily snaps us from a prior state of inattention, and that on occasions when we find amusement in an abrupt, often fleeting, but totally fresh understanding, the revelation itself may elicit laughter (as I already outlined above). Or that it is simply impossible to laugh authentically or infectiously unless you not only understand the joke, but fully acknowledge it. In this way, humour, if confessional, can be liberating at a deeply personal level, or, if satirical, liberating at a penetrating societal level. Lastly (in my necessarily limited rundown), humour serves as a wonderfully efficient and entertaining springboard for communicating insight and understanding, especially when the truths are dry, difficult to grasp or otherwise unpalatable. Here is a rhetorical economy that Hobbes might actually have approved of, were it not for his somewhat curmudgeonly disposition.

And why tell a joke here? Just to make you laugh and take your mind off the gravity of the topics covered and still more grave ones to come? To an extent, yes, but also to broaden out our discussion, letting it drift off into related philosophical avenues. For existence is seemingly absurd, is it not? Considered squarely, full-frontal, what’s it all about…? And jokes – especially ones that work beyond rational understanding – offer a playful recognition of the nonsensicalness of existence and of our species’ farcical determination to comprehend it and ourselves fully. What gives us the gall to ever speculate on the meaning of life, the universe and everything?

Meanwhile, we are free to choose: do we laugh or do we cry at our weird predicament? Both responses are surely sounder than cool insouciance, since both are flushed with blood. And were we madder, we might of course scream instead, whether in joy or terror. As Theseus says in Shakespeare’s A Midsummer Night’s Dream:

Lovers and madmen have such seething brains,
Such shaping fantasies, that apprehend
More than cool reason ever comprehends.
The lunatic, the lover, and the poet
Are of imagination all compact.

*

French existentialist Albert Camus famously made the claim: “There is but one truly serious philosophical problem and that is suicide.” 25 Camus was not an advocate of suicide, of course; far from it. In fact, he saw it as a perfectly vain attempt to flee from the inescapable absurdity of life, something he believed we ought to embrace in order to live authentically. Indeed, Camus regarded every attempt to deny the primacy of the ultimate meaninglessness of life, in a universe indifferent to our suffering, as a surrogate form of psychological suicide.

But rather than staring blankly into the abyss, Camus urges us to rebel against it: to face its absurdity without flinching and, through rebellion, to reconstruct the meaning of our lives afresh, each for ourselves, albeit by way of a paradoxically extreme rationality. Perhaps, though, he goes too far, reaching a point so extreme that few can follow: such a Sisyphean outlook is too desolate for most of us, and his exhortation to authenticity so impassioned that it seems almost infinitely taxing. 26 Kierkegaard’s “leap of faith” is arguably more forgiving of the human condition – but enough of philosophy. 27

This pause is meant for introspection. It presents an opportunity to reconsider how my interlude set out: not only by telling a joke – hopefully one that made you smile, if not laugh out loud – but also by reflecting upon the beautiful wisdom encapsulated in Chuang Tzu’s dream of becoming a butterfly; mystical enlightenment from fourth-century BC China that clashes intentionally with the plain silliness of a doctor-doctor joke about a moth-man; a surreal quip about clinical diagnosis and psychiatry (something I shall come to consider next).

However, the running theme here is one of transformation, and at the risk of also killing Chuang Tzu’s message by dissection, I will simply add (unnecessarily from the Daoist perspective) that existence does appear to be cyclically transformative; on personal, collective and altogether cosmic levels, the conscious and unconscious, spiralling outwards – whether upward into light or downward into darkness – each perpetually giving rise to the other just like the everblooming of yang and yin. As maverick clinical psychiatrist R. D. Laing once wrote:

“Most people most of the time experience themselves and others in one way or another that I… call egoic. That is, centrally or peripherally, they experience the world and themselves in terms of a consistent identity, a me-here over against you-there, within a framework of certain ground structures of space and time shared with other members of their society… All religious and all existential philosophies have agreed that such egoic experience is a preliminary illusion, a veil, a film of maya—a dream to Heraclitus, and to Lao Tzu, the fundamental illusion of all Buddhism, a state of sleep, of death, of socially accepted madness, a womb state to which one has to die, from which one has to be born.” 28

Returning from the shadowlands of alienation to contemplate the glinting iridescent radiance of Chuang Tzu’s butterfly’s wings is an invitation to scrape away the dross of habituated semi-consciousness that veils the playful mystery of our minds. On a different occasion, Chuang Tzu wrote:

One who dreams of drinking wine may in the morning weep; one who dreams weeping may in the morning go out to hunt. During our dreams we do not know we are dreaming. We may even dream of interpreting a dream. Only on waking do we know it was a dream. Only after the great awakening will we realize that this is the great dream. And yet fools think they are awake, presuming to know that they are rulers or herdsmen. How dense! You and Confucius are both dreaming, and I who say you are a dream am also a dream. Such is my tale. It will probably be called preposterous, but after ten thousand generations there may be a great sage who will be able to explain it, a trivial interval equivalent to the passage from morning to night. 29

Thus the world about us is scarcely less a construct of our imagination than our dreams are, deconstructed by the senses then seamlessly reconstructed in its entirety. And not just reconfigured via inputs from the celebrated five gateways of vision, sound, touch, taste and smell, but all portals including those of memory, intuition, and even reason. After all, it is curious how we speak of having ‘a sense’ of reason, just as we do ‘a sense’ of humour. Well, do we have… a sense of reason and a sense of humour? If you have followed this far then I sense you may share my own.

Next chapter…

*

Richard Rohr is a Franciscan priest, author and teacher, who says that his calling has been “to retrieve and re-teach the wisdom that has been lost, ignored or misunderstood in the Judeo-Christian tradition.” Rohr is the founder of the Center for Action and Contemplation and academic dean of the CAC’s Living School, where he practises incarnational mysticism, non-dual consciousness, and contemplation, with a particular emphasis on how these affect the social justice issues of our time. Recently he shared his inspirational standpoint in an hour-long chat with ‘Buddha at the Gas Pump’s Rick Archer:

*

Addendum: anyone with half a brain

“The intuitive mind is a sacred gift and the rational mind is a faithful servant. We have created a society that honors the servant and has forgotten the gift.”

— attributed to Albert Einstein 30

*

The development of split-brain operations for the treatment of severe cases of epilepsy – operations that involve severing the corpus callosum, the thick web of nerves that allows communication between the two hemispheres – first drew attention to how the left and right hemispheres have quite different attributes. Unfortunately, the early studies in this field produced erroneous because superficial notions about left and right brain functions, notions that were in turn vulgarised and popularised as they percolated down into pop psychology and management theory. The left brain was said to generate language and logic, while only the right brain supposedly dealt with feelings and was the creative centre. In reality, both hemispheres are involved in all aspects of cognition, and as a consequence the study of what is technically called the lateralisation of brain function fell to some extent into academic disrepute.

In fact, important differences do occur between the specialisms of the left and right hemispheres, although, as psychiatrist Iain McGilchrist proposes in his book The Master and His Emissary (the title alluding to what he sees as the proper roles of the right and left hemispheres respectively) 31, it is often better to understand the distinctions in terms of where conscious awareness is placed. In summary, the left hemisphere attends to and focuses narrowly but precisely on what is immediately in front of you, allowing you to strike the nail with the hammer, thread the eye of the needle, or sort the wheat from the chaff (or whatever activity you might be actively engaged with), while the right hemisphere remains highly vigilant and attentive to the surroundings. Thus, the left brain operates tools and usefully sizes up situations, while the right brain’s immediate relationship to the environment and to our bodies makes it the mediator of social activities and of a far broader conscious awareness. However, according to McGilchrist, the left brain is also convinced of its primacy, whereas the right is incapable of comprehending such hierarchies; and this is arguably the root of a problem we all face, since it repeatedly leads humans to construct societal arrangements and norms in accordance with left brain dominance, to the inevitable detriment of less restricted right brain awareness.

This has become the informed view of McGilchrist, supported by many decades of research, and if his overarching thesis has merit – note that the basic distinctions between left and right brain awareness are uncontroversial and well understood in psychology, whereas what he sees as their socio-historical repercussions is more speculative – then it raises brain function lateralisation as a major underlying issue that needs to be incorporated into any final appraisal of ‘human nature’, the implications of which McGilchrist propounds at length in his own writing. In the preface to the new expanded edition of The Master and His Emissary (first published in 2009), he writes:

I don’t want it to be possible, after reading this book, for any intelligent person ever again to see the right hemisphere as the ‘minor’ hemisphere, as it used to be called – still worse the flighty, impetuous, fantastical one, the unreliable but perhaps fluffy and cuddly one – and the left hemisphere as the solid, dependable, down-to-earth hemisphere, the one that does all the heavy lifting and is alone the intelligent source of our understanding. I might still be to some extent swimming against the current, but there are signs that the current may be changing direction.

*

Embedded below is a lecture given to the Royal Society of Arts (RSA) in 2010, in which he offers a concise overview of how, according to our current understanding, the ‘divided brain’ has profoundly altered human behaviour, culture and society:

To hear these ideas contextualised within an evolutionary account of brain laterality, I also recommend a lecture given to The Evolutionary Psychiatry Special Interest Group of the Royal College of Psychiatrists in London (EPSIG UK) in 2018:

For more from Iain McGilchrist I also recommend this extended interview with physicist and filmmaker Curt Jaimungal, host of Theories of Everything, which premiered on March 29th:

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.

*

1  “insensible perceptions are as important to [the science of minds, souls, and soul-like substances] as insensible corpuscles are to natural science, and it is just as unreasonable to reject the one as the other on the pretext that they are beyond the reach of our senses.” From the Preface of New Essays concerning Human Understanding by Gottfried Leibniz, written around 1704 and published posthumously; translation courtesy of the Stanford Encyclopedia of Philosophy.

2 From Anthropology from a Pragmatic Point of View by Immanuel Kant, first published in 1798.

3 “The definition of Psychology may be best given… as the description and explanation of states of consciousness as such. By states of consciousness are meant such things as sensations, desires, emotions, cognitions, reasonings, decisions, volitions, and the like. Their ‘explanation’ must of course include the study of their causes, conditions, and immediate consequences, so far as these can be ascertained.” from opening paragraph of “Introduction: Body and Mind” from The Principles of Psychology, by William James, first published in 1892.

4 Extract taken from The Varieties of Religious Experience, from chapter on “The Sick Soul”.

5 Letter to his friend, Francis Child.

6 According to James, the first division of “the self” that can be discriminated is between “the self as known”, the me, and “the self as knower”, the I, or “pure ego”. The me he then suggests might be sub-divided in a constituent hierarchy: “the material me” at the lowest level, then “the social me” and top-most “the spiritual me”. It was not until very much later, in the 1920s, that Freud fully developed his own tripartite division of the psyche into id, ego and super-ego, a division that surely owes much to James.

7

In the spring of 1876, a young man of nineteen arrived in the seaside city of Trieste and set about a curious task. Every morning, as the fishermen brought in their catch, he went to meet them at the port, where he bought eels by the dozens and then the hundreds. He carried them home, to a dissection table in a corner of his room, and—from eight until noon, when he broke for lunch, and then again from one until six, when he quit for the day and went to ogle the women of Trieste on the street—he diligently slashed away, in search of gonads.

“My hands are stained by the white and red blood of the sea creatures,” he wrote to a friend. “All I see when I close my eyes is the shimmering dead tissue, which haunts my dreams, and all I can think about are the big questions, the ones that go hand in hand with testicles and ovaries—the universal, pivotal questions.”

The young man, whose name was Sigmund Freud, eventually followed his evolving questions in other directions. But in Trieste, elbow-deep in slime, he hoped to be the first person to find what men of science had been seeking for thousands of years: the testicles of an eel. To see them would be to begin to solve a profound mystery, one that had stumped Aristotle and countless successors throughout the history of natural science: Where do eels come from?

From an article entitled “Where Do Eels Come From?” written by Brooke Jarvis, published in The New Yorker on May 18, 2020. https://www.newyorker.com/magazine/2020/05/25/where-do-eels-come-from

8 In the BBC TV sci-fi comedy Red Dwarf (Series 1, episode “Confidence and Paranoia”), the eponymous characters Confidence and Paranoia form an alternative superego-id partnership, existing as physical manifestations which appear on board as symptoms of Lister’s illness.

9 Fixing on specific erogenous zones of the body, Freud believed that libidinous desire shaped our psychological development in a very specific fashion, naturally progressing, if permitted, through early stages from oral to anal and then, reaching adulthood, to genital.

10 Jocasta, the queen of Thebes, is barren, and so she and her husband, the king Laius, decide to consult the Oracle of Delphi. The Oracle tells them that if Jocasta bears a son, then the son will kill his father and marry her. Later, when Jocasta does indeed have a son, Laius demands that a servant take the baby to a mountain to be abandoned, his ankles pinned together just in case. But Oracles are rarely mistaken, fate is hard to avoid, and so as it happens the servant spares the infant, giving him to a shepherd instead. Eventually, as fortune will have it, the infant is adopted by the king and queen of Corinth, and named Oedipus because of the swellings on his feet. Years pass. Then, one day Oedipus learns that the king and queen are not his parents, but when he asks them, they deny the truth. So Oedipus decides to put the question to the Oracle of Delphi instead, who, being an enigmatic type, refuses to identify his true parents, but foretells his future instead, saying that he is destined to kill his father and marry his mother. Determined to avoid this fate, Oedipus resolves not to return home to Corinth, heading to, you guessed it, Thebes instead. He comes to an intersection of three roads and meets Laius driving a chariot. They argue about who has the right of way and then, in an early example of road rage, their quarrel spills into a fight and thus Oedipus unwittingly kills his real father. Next up, he meets the sphinx, who asks its famous riddle. This is a question of life and death, all who have answered incorrectly having been killed and eaten, but Oedipus gets the answer right and so, obligingly, the sphinx kills itself instead. Having freed the people of Thebes from the sphinx, Oedipus receives the hand of the recently widowed Jocasta in marriage. All is well for a while, but then it comes to pass that Jocasta learns who Oedipus really is, and hangs herself. Then, later again, Oedipus discovers that he was the murderer of his own father, and gouges his own eyes out.

11 Sigmund Freud, The Interpretation of Dreams, chapter V, “The Material and Sources of Dreams”

12 From an essay by C.G. Jung published in CW XI, para 520. The word ‘Raca’ is an insult, translated as ‘worthless’ or ‘empty’, taken from a passage in the Sermon on the Mount, Matthew 5:22.

13 Jung described the shadow in a key passage as “that hidden, repressed, for the most part inferior and guilt-laden personality whose ultimate ramifications reach back into the realm of our animal ancestors… If it has been believed hitherto that the human shadow was the source of evil, it can now be ascertained on closer investigation that the unconscious man, that is his shadow, does not consist only of morally reprehensible tendencies, but also displays a number of good qualities, such as normal instincts, appropriate reactions, realistic insights, creative impulses etc.”

From Jung’s Collected Works, Volume 9, Part 2, paragraphs 422–3.

14 In response to a question in an interview conducted by John Freeman just two years before Jung’s death, broadcast in 1959 as part of the BBC TV series Face to Face. Having asked about Jung’s childhood and whether he had been made to attend church, Freeman then asked: “Do you now believe in God?” Jung replied: “Now? Difficult to answer… I know. I don’t need to believe, I know.”

15 The quote in full reads: “This is the only story of mine whose moral I know. I don’t think it’s a marvelous moral, I just happen to know what it is: We are what we pretend to be, so we must be careful about what we pretend to be.” From Mother Night (1962) by Kurt Vonnegut.

16 The Human Situation is a collection of lectures first delivered by Aldous Huxley at the University of California in 1959. These were edited by Piero Ferrucci and first published in 1978 by Chatto & Windus, London. Both extracts here were taken from his lecture on “Language”, p. 172.

17 This is the premise behind Orwell’s ‘Newspeak’ used in his dystopian novel Nineteen Eighty-Four. In Chapter 5, Syme, a language specialist and one of Winston Smith’s colleagues at the Ministry of Truth, explains enthusiastically to Winston:

“Don’t you see that the whole aim of Newspeak is to narrow the range of thought? In the end we shall make thoughtcrime literally impossible, because there will be no words in which to express it. Every concept that can ever be needed, will be expressed by exactly one word, with its meaning rigidly defined and all its subsidiary meanings rubbed out and forgotten.”

18 I should note that the idea proposed here is not altogether original and that the original concept of ‘linguistic relativity’ is jointly credited to linguists Edward Sapir and Benjamin Whorf who, whilst working independently, came to the parallel conclusion that (in the strong form) language determines thought or (in the weak form) language and its usage influences thought. Whorf also inadvertently created the urban myth that Eskimos have a hundred words for snow after he wrote in a popular article: “We [English speakers] have the same word for falling snow, snow on the ground, snow hard packed like ice, slushy snow, wind-driven snow – whatever the situation may be. To an Eskimo, this all-inclusive word would be almost unthinkable…” The so-called “Sapir-Whorf hypothesis” continues to inspire research in psychology, anthropology and philosophy.

19 After writing this, I read Richard Dawkins’ The Ancestor’s Tale. Aside from being a most wonderful account of what Dawkins poetically describes as his ‘pilgrimage to the dawn of life’, here Dawkins also returns to many earlier themes of his other books, occasionally moderating or further elucidating previous thoughts and ideas. In the chapter entitled ‘The Peacock’s Tale’ [pp 278–280], he returns to speculate more about the role memes may have had in human development. In doing so he presents an idea put forward by his friend, the philosopher Daniel Dennett, in his book Consciousness Explained, which is that local variation of memes is inevitable:

“The haven all memes depend on reaching is the human mind, but the human mind is itself an artifact created when memes restructure a human brain in order to make it a better habitat for memes. The avenues for entry and departure are modified to suit local conditions, and strengthened by various artificial devices that enhance fidelity and prolixity of replication: native Chinese minds differ dramatically from native French minds, and literate minds differ from illiterate minds.” And is it not also implicit here that the unconscious brain will likewise be differently ‘restructured’ by different environmental influences?

20 Barack Obama, whose own election was acclaimed by some and witnessed by many as proof of the American Dream, recently compared E pluribus unum to the Indonesian motto Bhinneka Tunggal Ika — unity in diversity.

“But I believe that the history of both America and Indonesia should give us hope. It is a story written into our national mottos. In the United States, our motto is E pluribus unum — out of many, one. Bhinneka Tunggal Ika — unity in diversity. (Applause.) We are two nations, which have traveled different paths. Yet our nations show that hundreds of millions who hold different beliefs can be united in freedom under one flag.” Press release (unedited) from The White House, posted November 10th, 2010: “remarks by the President at the University of Indonesia in Jakarta, Indonesia”

21 Summary of statistical analysis by the Center for American Progress, “Understanding Mobility in America”, by Tom Hertz, American University, published April 26th, 2006. Amongst the key findings was a discovery that “Children from low-income families have only a 1 percent chance of reaching the top 5 percent of the income distribution, versus children of the rich who have about a 22 percent chance [of remaining rich].” and that “By international standards, the United States has an unusually low level of intergenerational mobility: our parents’ income is highly predictive of our income as adults.” The report adds that “Intergenerational mobility in the United States is lower than in France, Germany, Sweden, Canada, Finland, Norway and Denmark. Among high-income countries for which comparable estimates are available, only the United Kingdom had a lower rate of mobility than the United States.”

Reproduced from an article entitled “Advertising vs. Democracy: An Interview with Jean Kilbourne” written by Hugh Iglarsh, published in Counterpunch magazine on October 23rd 2020. https://www.counterpunch.org/2020/10/23/advertising-vs-democracy-an-interview-with-jean-kilbourne/ 

22 In his follow-up to the more famous On the Origin of Species (1859) and The Descent of Man (1871), Charles Darwin reported in Chapter VI, entitled “Special Expressions of Man: Suffering and Weeping”, of his third major work The Expression of the Emotions in Man and Animals (1872), that:

I was anxious to ascertain whether there existed in any of the lower animals a similar relation between the contraction of the orbicular muscles during violent expiration and the secretion of tears; but there are very few animals which contract these muscles in a prolonged manner, or which shed tears. The Macacus maurus, which formerly wept so copiously in the Zoological Gardens, would have been a fine case for observation; but the two monkeys now there, and which are believed to belong to the same species, do not weep. Nevertheless they were carefully observed by Mr. Bartlett and myself, whilst screaming loudly, and they seemed to contract these muscles; but they moved about their cages so rapidly, that it was difficult to observe with certainty. No other monkey, as far as I have been able to ascertain, contracts its orbicular muscles whilst screaming.

The Indian elephant is known sometimes to weep. Sir E. Tennent, in describing these which he saw captured and bound in Ceylon, says, some “lay motionless on the ground, with no other indication of suffering than the tears which suffused their eyes and flowed incessantly.” Speaking of another elephant he says, “When overpowered and made fast, his grief was most affecting; his violence sank to utter prostration, and he lay on the ground, uttering choking cries, with tears trickling down his cheeks.” In the Zoological Gardens the keeper of the Indian elephants positively asserts that he has several times seen tears rolling down the face of the old female, when distressed by the removal of the young one. Hence I was extremely anxious to ascertain, as an extension of the relation between the contraction of the orbicular muscles and the shedding of tears in man, whether elephants when screaming or trumpeting loudly contract these muscles. At Mr. Bartlett’s desire the keeper ordered the old and the young elephant to trumpet; and we repeatedly saw in both animals that, just as the trumpeting began, the orbicular muscles, especially the lower ones, were distinctly contracted. On a subsequent occasion the keeper made the old elephant trumpet much more loudly, and invariably both the upper and lower orbicular muscles were strongly contracted, and now in an equal degree. It is a singular fact that the African elephant, which, however, is so different from the Indian species that it is placed by some naturalists in a distinct sub-genus, when made on two occasions to trumpet loudly, exhibited no trace of the contraction of the orbicular muscles.

The full text is uploaded here: https://www.gutenberg.org/files/1227/1227-h/1227-h.htm#link2HCH0006

23 Quote from The Expression of the Emotions in Man and Animals (1872), Chapter VIII “Joy, High Spirits, Love, Tender Feelings, Devotion” by Charles Darwin. He continues:

The circumstances must not be of a momentous nature: no poor man would laugh or smile on suddenly hearing that a large fortune had been bequeathed to him. If the mind is strongly excited by pleasurable feelings, and any little unexpected event or thought occurs, then, as Mr. Herbert Spencer remarks, “a large amount of nervous energy, instead of being allowed to expend itself in producing an equivalent amount of the new thoughts and emotion which were nascent, is suddenly checked in its flow.” . . . “The excess must discharge itself in some other direction, and there results an efflux through the motor nerves to various classes of the muscles, producing the half-convulsive actions we term laughter.” An observation, bearing on this point, was made by a correspondent during the recent siege of Paris, namely, that the German soldiers, after strong excitement from exposure to extreme danger, were particularly apt to burst out into loud laughter at the smallest joke. So again when young children are just beginning to cry, an unexpected event will sometimes suddenly turn their crying into laughter, which apparently serves equally well to expend their superfluous nervous energy.

The imagination is sometimes said to be tickled by a ludicrous idea; and this so-called tickling of the mind is curiously analogous with that of the body. Every one knows how immoderately children laugh, and how their whole bodies are convulsed when they are tickled. The anthropoid apes, as we have seen, likewise utter a reiterated sound, corresponding with our laughter, when they are tickled, especially under the armpits… Yet laughter from a ludicrous idea, though involuntary, cannot be called a strictly reflex action. In this case, and in that of laughter from being tickled, the mind must be in a pleasurable condition; a young child, if tickled by a strange man, would scream from fear…. From the fact that a child can hardly tickle itself, or in a much less degree than when tickled by another person, it seems that the precise point to be touched must not be known; so with the mind, something unexpected – a novel or incongruous idea which breaks through an habitual train of thought – appears to be a strong element in the ludicrous.

24 Quote from Leviathan (1651), The First Part, Chapter 6, by Thomas Hobbes (with italics and spelling as original). Hobbes continues:

And it is incident most to them, that are conscious of the fewest abilities in themselves; who are forced to keep themselves in their own favour, by observing the imperfections of other men. And therefore much Laughter at the defects of others is a signe of Pusillanimity. For of great minds, one of the proper workes is, to help and free others from scorn; and compare themselves onely with the most able.

Interestingly, Hobbes then immediately offers his account of weeping as follows:

On the contrary, Sudden Dejection is the passion that causeth WEEPING; and is caused by such accidents, as suddenly take away some vehement hope, or some prop of their power: and they are most subject to it, that rely principally on helps externall, such as are Women, and Children. Therefore, some Weep for the loss of Friends; Others for their unkindnesse; others for the sudden stop made to their thoughts of revenge, by Reconciliation. But in all cases, both Laughter and Weeping, are sudden motions; Custome taking them both away. For no man Laughs at old jests; or Weeps for an old calamity.

https://www.gutenberg.org/files/3207/3207-h/3207-h.htm#link2H_PART1

25 “Il n’y a qu’un problème philosophique vraiment sérieux : c’est le suicide.” (“There is but one truly serious philosophical problem, and that is suicide.”) Quote taken from The Myth of Sisyphus (1942) by Albert Camus, translated by Justin O’Brien.

26 In Greek mythology Sisyphus was punished in hell by being forced to roll a huge boulder up a hill only for it to roll back down every time, his task repeating for eternity. In his philosophical essay The Myth of Sisyphus (1942) Camus compares this unremitting and unrewarding task of Sisyphus to the lives of ordinary people in the modern world, writing:

“The workman of today works every day in his life at the same tasks, and this fate is no less absurd. But it is tragic only at the rare moments when it becomes conscious.”

In sympathy he also muses on Sisyphus’ thoughts, especially as he trudges in despair back down the mountain to collect the rock again. He writes:

“You have already grasped that Sisyphus is the absurd hero. He is, as much through his passions as through his torture. His scorn of the gods, his hatred of death, and his passion for life won him that unspeakable penalty in which the whole being is exerted toward accomplishing nothing. This is the price that must be paid for the passions of this earth. Nothing is told us about Sisyphus in the underworld. Myths are made for the imagination to breathe life into them.”

Continuing:

“It is during that return, that pause, that Sisyphus interests me. A face that toils so close to stones is already stone itself! I see that man going back down with a heavy yet measured step toward the torment of which he will never know the end. That hour like a breathing-space which returns as surely as his suffering, that is the hour of consciousness. At each of those moments when he leaves the heights and gradually sinks toward the lairs of the gods, he is superior to his fate. He is stronger than his rock.

“If this myth is tragic, that is because its hero is conscious. Where would his torture be, indeed, if at every step the hope of succeeding upheld him? The workman of today works every day in his life at the same tasks, and his fate is no less absurd. But it is tragic only at the rare moments when it becomes conscious. Sisyphus, proletarian of the gods, powerless and rebellious, knows the whole extent of his wretched condition: it is what he thinks of during his descent. The lucidity that was to constitute his torture at the same time crowns his victory. There is no fate that cannot be surmounted by scorn.”

You can read the extended passage here: http://dbanach.com/sisyphus.htm

27 Søren Kierkegaard never actually coined the term “leap of faith” although he did use the more general notion of “leap” to describe situations whenever a person is faced with a choice that cannot be fully justified rationally. Moreover, in this instance the “leap” is perhaps better described as a leap “towards” or “into” faith that finally overcomes what Kierkegaard saw as an inherent paradoxical contradiction between the ethical and the religious. However, Kierkegaard never advocates “blind faith”, but instead recognises that faith ultimately calls for action in the face of absurdity.

In Part Two, “The Subjective Issue”, of his 1846 work and impassioned attack against Hegelianism, Concluding Unscientific Postscript to the Philosophical Fragments (Danish: Afsluttende uvidenskabelig Efterskrift til de philosophiske Smuler), which is known for its dictum, “Subjectivity is Truth”, Kierkegaard wrote:

“When someone is to leap he must certainly do it alone and also be alone in properly understanding that it is an impossibility… the leap is the decision… I am charging the individual in question with not willing to stop the infinity of [self-]reflection. Am I requiring something of him, then? But on the other hand, in a genuinely speculative way, I assume that reflection stops of its own accord. Why, then, do I require something of him? And what do I require of him? I require a resolution.”

28 R. D. Laing, The Politics of Experience (Ballantine Books, N.Y., 1967)

29 Quoted from the book known as Zhuangzi (also transliterated as Chuang Tzu or Chuang Chou). Translation by Lin Yutang.

30 Although in all likelihood a reworking of a passage from a book entitled The Metaphoric Mind: A Celebration of Creative Consciousness, written by Bob Samples and published in 1976, in which the fuller passage reads [with emphasis added]:

“The metaphoric mind is a maverick. It is as wild and unruly as a child. It follows us doggedly and plagues us with its presence as we wander the contrived corridors of rationality. It is a metaphoric link with the unknown called religion that causes us to build cathedrals — and the very cathedrals are built with rational, logical plans. When some personal crisis or the bewildering chaos of everyday life closes in on us, we often rush to worship the rationally-planned cathedral and ignore the religion. Albert Einstein called the intuitive or metaphoric mind a sacred gift. He added that the rational mind was a faithful servant. It is paradoxical that in the context of modern life we have begun to worship the servant and defile the divine.”

31 The book is subtitled The Divided Brain and the Making of the Western World.


the united colours of Bilderberg — a late review of Montreux 2019: #1 status quo warriors

This is the first in a sequence of articles based around the ‘key topics’ of this year’s Bilderberg conference, discussed in relation to the prevailing political agenda and placed within their immediate historical context.

*

Smoke on the water

We all came out to Montreux, on the Lake Geneva shoreline
To make records with a mobile, we didn’t have much time

— Deep Purple 1

Is it any exaggeration to say that western civilization is in the midst of an existential crisis? No longer tethered by the old sturdy belief in post-Enlightenment progress, at best we seem to be drifting aimlessly, and at worst, lost at sea and beginning to take on water.

Amongst the young especially, a common view has developed that we are living through a uniquely historical moment. There is a quickening sense that unless the current socioeconomic course can be abruptly diverted, not just the human species, but the biosphere as a whole, will be dashed to pieces as together we plunge into a vortex of our own making. The prospect of an environmental catastrophe on a truly planetary scale is now top of many people’s concerns, and understandably therefore, a commensurately international environmental resistance movement has arisen. A few are even asking whether we need a global dictatorship to solve the environmental problems of the twenty-first century. Of course, we should always be careful what we wish for!

The gross flaws inherent in our prevailing neoliberal orthodoxy present us with a still more immediate and thus more daunting threat. Vast disparities of wealth and income have been rupturing our societies as the impact of perpetual “austerity” impoverishes millions and spreads untold misery. Inequalities that have lain partially dormant during the decade since the last crash are now beginning to feed an upcoming breed of far-right demagogues and more overt fascists. But the political centre cannot hold for a reason: by adopting right-wing economic policies, it too became virulently extreme. In fact, the measures that brought us to a crisis point remain wholly endorsed by today’s extreme “centrists” perhaps best exemplified by French President Emmanuel Macron.

Finally, a less spoken-of, if occasionally numbing dread is felt somewhere in the back of all our minds, as Nato powers drag the world unsteadily into the era of a new Cold War and we once again glimpse the unfathomable absurdity of nuclear obliteration. Oddly, this time around, the unspeakable apparitions of apocalyptic doom seen glinting occasionally across Mike Pompeo’s sociopathic gaze, or else blurted spasmodically in the nocturnal delirium of Trump’s presidency-by-Twitter, seldom shock us, because we have all but forgotten how to be more seriously afraid. Our conscious minds are thoroughly distracted, whether by the material consumerism of our nonstop Black Friday (in societies that know nothing about thanksgiving) or by the more ethereal dopamine rewards of social media, whilst abandoned and denied, yet still lurking unconscious, is a kind of clammy white vertigo of impossible horrors.

On September 28th, Chris Hedges spoke on his RT show “On Contact” with fellow journalist Stephen Kinzer about efforts by Riyadh and Washington to cripple Iran’s economy, inevitably putting Saudi Arabia, its Gulf allies and Washington on a collision course with the Islamic republic that could end in war:

When Bush and Blair were about to deliver their “shock and awe” bloodbath to capture the non-existent weapons of mass destruction operated by Saddam, in London alone two million gathered on the streets to shout truth to power. The antiwar message was loud and clear. How many will gather with placards if Trump and Johnson now decide to send our forces to bring down Iran? The marginalisation of the antiwar movement, in the very midst of the 21st Century’s war without end – its frontline stretching through the Middle East and Central Asia, and more insidiously spreading across Africa – is another disturbing trend.

For these and other reasons, the call for sweeping changes is on the rise in many quarters, and who can deny that western civilisation is in need of swift and sweeping transformation? The old capitalist system is dying, and the elites, the establishment, the globalists (alternative labels for the class of oligarchs who carelessly own and exploit more than half the planet and its “resources”) understand this better than anyone. After all, potentially at least, they stand to lose most in its demise. As the Guardian’s token Bilderberg correspondent Charlie Skelton observed sardonically, reporting from this year’s conference in Montreux:

A crisis is looming for Bilderberg, and not merely because of the rise in anti-globalization movements and a creeping loss of faith in the EU project. It’s a crisis of leadership. With the Brexit, Frexit, Grexit and even Polexit dominoes threatening to fall, Bilderberg needs to gird its loins for the long haul if it wants the transatlantic alliance to thrive and its beloved EU to survive. But who’s going to be doing the girding?

The problem Bilderberg faces is a loss of quality, of intellectual backbone. With David Rockefeller tucked away since 2017 in his cryogenic pod, and Henry Kissinger knocking on hell’s door, you realize that Bilderberg is facing a generational crisis. You might not like or admire Henry Kissinger, you might want him strung up for war crimes, but you have to admit he’s a heavyweight statesman and historian. He’s a psychopath with vision. Where will Bilderberg find the serious ideologues to lead them into the 2020s? 2

Click here to read Skelton’s full article published by Newsweek.

*

Non-violent totalitarianism

“By means of ever more effective methods of mind-manipulation, the democracies will change their nature; the quaint old forms—elections, parliaments, Supreme Courts and all the rest—will remain. The underlying substance will be a new kind of non-violent totalitarianism. All the traditional names, all the hallowed slogans will remain exactly what they were in the good old days. Democracy and freedom will be the theme of every broadcast and editorial—but democracy and freedom in a strictly Pickwickian sense. Meanwhile the ruling oligarchy and its highly trained elite of soldiers, policemen, thought-manufacturers and mind-manipulators will quietly run the show as they see fit.” — Aldous Huxley 3

*

Born in Boston in 1910, Carroll Quigley read history at Harvard University, afterwards going on to teach history, first at Princeton, before returning to Harvard to lecture in Government, History and Politics. Later again, he moved to Georgetown University, where he became one of its most eminent professors. 4 But there were also other strings to Quigley’s prodigious bow.

Quigley had worked for the House Select Committee on Astronautics and Space Exploration. He became a consultant for the Navy, advising on the development of weapons systems. He had even advised the Smithsonian Institution on the layout of their Museum of Science and Technology.

An exceptional polymath, Quigley was respected and influential. Bill Clinton famously singled him out for special mention during his acceptance speech to the 1992 Democratic National Convention, saying:

“As a teenager, I heard John Kennedy’s summons to citizenship. And then, as a student at Georgetown, I heard that call clarified by a professor named Carroll Quigley, who said to us that America was the greatest Nation in history because our people had always believed in two things – that tomorrow can be better than today and that every one of us has a personal moral responsibility to make it so.” 5

In 1966, Quigley wrote a remarkable if little known book. Entitled Tragedy and Hope: A History of the World in our Time, it recounts the central role played by Cecil Rhodes, the English imperialist and founder of the De Beers diamond company (which at the time held a virtual monopoly in the diamond mining industry), and by the societies and associations Rhodes established – the so-called Round Table Groups – in extending influence and bringing to fruition his and others’ ambitions for expanding the British Empire. 6

“The Round Table Groups”, Quigley explains, “were semi-secret discussion and lobbying groups whose original purpose was to federate the English-speaking world along lines laid down by Cecil Rhodes.” 7 To what political ends? Quigley is quite clear: irrespective of what the John Birch Society afterwards claimed, this was very far from a communist plot:

“…there is no evidence of which I am aware of any explicit plot or conspiracy to direct American policy in a direction favorable either to the Soviet Union or to international Communism.” 8 In fact, Quigley unequivocally dismisses all theories of a communist conspiracy as a “Radical Right fairytale”, before going on to make his more important and eye-opening assertion:

“There does exist, and has existed for a generation, an international Anglophile network which operates, to some extent, in the way the radical Right believes the Communists act. In fact, this network, which we may identify as the Round Table Groups, has no aversion to cooperating with the Communists, or any other groups, and frequently does so. I know of the operations of this network because I have studied it for twenty years and was permitted for two years, in the early 1960’s, to examine its papers and secret records. I have no aversion to it or to most of its aims and have, for much of my life, been close to it and to many of its instruments. I have objected, both in the past and recently, to a few of its policies (notably to its belief that England was an Atlantic rather than a European Power and must be allied, or even federated, with the United States and must remain isolated from Europe), but in general my chief difference of opinion is that it wishes to remain unknown, and I believe its role in history is significant enough to be known.” 9

The part of this network which Quigley says he had the greatest access to was the Council on Foreign Relations (CFR). Founded in 1921 as “a nonpartisan and independent membership organization”, the CFR, Quigley tells us, was actually set up as “a front for J.P. Morgan and Company in association with the then very small American Round Table Group”, such that by 1928, “the Council on Foreign Relations was dominated by the associates of the Morgan bank.” 10 Indeed, Quigley later informs us that funds for all these Round Table activities came primarily from Cecil Rhodes himself, alongside J.P. Morgan, the Rockefeller and Whitney families and associates of the bankers Lazard Brothers and Morgan, Grenfell and Company. Apparently their design was to be a grand one:

“The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent private meetings and conferences. The apex of the system was to be the Bank for International Settlements in Basle, Switzerland 11, a private bank owned and controlled by the world’s central banks which were themselves private corporations. Each central bank sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world.” 12

A world controlled by international banking interests – who would have thought so? A world of “cooperative politicians” coerced to do their bidding by offers of “subsequent economic rewards in the business world” – oh, come on now… is there even a shred of evidence?

*

Bilderberg is a key part of an extensive network of loosely affiliated private groups, institutes, ‘think tanks’ and other organisations that include, in descending order of secrecy, the Trilateral Commission, the US Council on Foreign Relations, its UK cousin the Royal Institute of International Affairs (better known as Chatham House), and not forgetting the World Economic Forum in Davos.

Bilderberg is arguably the most prestigious and is certainly the most “private” of all these.

It is the place (so far as we know) where our own class of oligarchs, those we might usefully distinguish as Atlanticists (plutocrats in the Anglo-American sphere and those who serve them), meet annually to discuss business and to make arrangements with their political go-betweens. This is all done in strict adherence to Chatham House Rules which means we can never know for sure who said what to whom, and thus importantly, who was receiving instructions and who was giving them. We do however know that Bilderberg isn’t managed according to egalitarian principles, and no great leap of imagination is needed to recognise the entrenched internal hierarchy with its top-down steering committee to decide the agenda, topped again – we learn this year – by a managerial board: in effect this is Bilderberg Inc. Quelle surprise.

There are many reasons why Bilderberg operates in darkness, but the semi-official one is that the delegates hide out to avoid the prying gaze of public attention, i.e., they don’t want to have the likes of us looking over their shoulders when they are in the process of trying to run things. In fact this repeated assertion is hardly worth doubting. That ‘the great and the good’ of Bilderberg are the best and most worthy leaders is perfectly self-evident – how else did they rise to such prominence if not because of their exceptional calibre? It follows as a matter of course that they eschew, as they see it, the incompetent meddling of the public.

Its (reliably incomplete) list of participants also provides insight into Bilderberg’s political leanings, and this year was interesting not just for the inclusion of representatives from both sides of the mainstream political aisle (the usual practice in fact), but for the more surprising appearance of a representative of the Greens: the attendance of Dutch MP Kathalijne Buitenweg was indeed a novelty.

I have highlighted this bringing of a Green MP into the fold because it is revealing. Not only should it challenge a widely held opinion that the greens are inherently anti-establishment, but it also shines light on the peculiar nature of Bilderberg, which aims always to cover all available political bases, and thus perennially invites a mix of individuals feigning to be conservatives and progressives when it is by nature neither conservative nor progressive, but a phoney amalgam – so we need another word: I tentatively propose “congressive”.

I need to expand on this point a little. Bilderberg is not strictly conservative due to its efforts to keep ahead of the curve, proactively (a horrible word too, but an equally appropriate one) guiding and railroading future advancements under its broad remit to concentrate and centralise existing power. It is this forward-looking, and in some respects pioneering outlook – which is seldom if ever progressive in any recognisably leftist sense – that helps to preserve the status quo; a feature of Bilderberg that is readily apparent once we consider their annual agenda, and especially this year’s list of ‘key topics’. Here’s my own schematically enhanced version:

Notice first how many of the listed items completely transcend the everyday concerns of the industrialists, defence contractors, financiers and bankers, heads of intelligence, and military top brass who make up its main contingent.

Why are they even discussing “the ethics of artificial intelligence” or “the importance of space”? In fact, in both cases another cursory glance down the list of participants elucidates one of the likely reasons…

That’s Matthew Daniels, Technical Director for Machine Learning and AI at the Office of the Under Secretary of Defense for Research & Engineering, US Department of Defense, having a good old natter with Patrice Caine, Chairman and CEO of Thales Group, the French multinational that designs and builds electrical systems and provides services for the aerospace, defence and security sectors.

And here’s Admiral (Ret) James Ellis, former Commander of US Strategic Command and current Director of Lockheed Martin leading the way for Jānis Sārts, the Director of NATO Strategic Communications Centre of Excellence. Can you feel the breeze from those revolving doors?

And there is a second reason to welcome the Greens into the Bilderberg fold. It is arguably the most brilliant ruse of the ruling class: the ability to maintain the illusion of electoral free choice. It is a ploy I first came to understand during a spell I spent in retailing: expanding product lines reliably boosts overall sales and turnover. The same is true when it comes to political choice: by giving the impression of a greater variety of political alternatives, public interest is maintained and electoral turnout is bolstered, all of which serves to maintain the semblance of democracy.

But it is a difficult process, of course, to manufacture political alternatives out of whole cloth. Successful newcomers such as Macron’s neoliberal relaunch under the Vichyesque banner of En Marche! tend to be the exceptions – incidentally, Macron attended Bilderberg in 2014 and became President of France in 2017 (one of many Bilderberg success stories!).

Other comparative newcomers include the quick-to-sell-out Syriza coalition in Greece; Spain’s more honourable leftist alternative Podemos; and its Machiavellian centre-right adversary Ciudadanos (Citizens), whose leader Inés Arrimadas joined Pablo Casado, President of the mainstream conservative Partido Popular, at this year’s Bilderberg conference.

Italy’s noxious, if more enigmatic, Movimento 5 Stelle (Five Star Movement) and Nigel Farage’s opportunistic Brexit Party are also notable exceptions to the rule. And of all these new political players, Ciudadanos and En Marche! are unusual in that they receive annual invitations to Bilderberg. 13

Contrast these few successes with the more typical death spiral – perhaps best epitomised by the already defunct wannabe centrists Change UK – and it becomes clear that prefabricated parties only seldom succeed. Instead, the promotion of special interests is more reliably achieved through the capture of established political parties, as well as the infiltration of grassroots movements.

In Britain for instance, the takeover of the Labour Party was negotiated by Peter Mandelson (a Bilderberg grandee), its rebranding achieved under Neil Kinnock’s leadership, and the hijacking thereafter concluded under Blair (another Bilderberg attendee). Corbyn’s attempt to undo this process is hampered at every step by the same Blairites who, having seized control of the party machinery, remain ensconced at all levels beyond the rank and file of ordinary membership. Here is Labour peer Lord Adonis, the consummate Blairite, speaking live on an LBC radio phone-in last September, encouraging the British public not to vote Labour:

And here is Lord Adonis enjoying his minibreak in Montreux:

Outstanding amongst this year’s crop of nominally leftist progressives is Stacey Abrams, a former member of the Council on Foreign Relations, who at the time of Bilderberg was being touted to run as a Democratic presidential candidate in 2020. 14 Mary Kay Henry, the International President of the Service Employees International Union, was another of this year’s cohort who must account for a glaring conflict of interests, about which she instead prefers to remain tight-lipped [from 15:15 mins]:

So where is this leading? Democracy is always a moveable term. On the one hand it has become more or less synonymous with mere electoral procedure, and on the other, it is held up as a shining western standard, especially when it comes to imposing American-led versions of it throughout the world. American democracy – well, that’s one word for it, in case you didn’t know!

Trojan Liberty by Anthony Freda

Thus democracy, as we authorise it, can also be gauged negatively. Any government acting against western interests – specifically against US foreign policy – will be singled out and rebranded “a regime” irrespective of the transparency and rigour of its electoral procedures. These days the West alone is sanctioned to decide who is and isn’t “a dictator”. Yet even judged in accordance with America’s own definition of the word, the United States provides military assistance to more than 70% of the world’s dictatorships – what better measure of double standards and flagrant hypocrisy?

Empire of Chaos – Liberty Bomb by Anthony Freda

In short, whether the determination of democracy is applied to foreign or domestic governments, there is never recourse to the neat definition proffered in the Gettysburg Address: “government of the people, by the people, for the people.” Why? Because set against this measurable benchmark of popular sovereignty, there never has been true democracy, whether in America or elsewhere. No government has ever served the common interests of the people. As French democratic socialist Jean-Luc Mélenchon reminded us in a recent interview:

“[T]he French Socialist Jean Jaurès once said, the only question posed in politics is that of the people’s sovereignty. All the rest depends on that, including the question of the distribution of wealth, for this is a matter of reasserting democracy.” 15

Indeed, if we are to take Lincoln’s words seriously and judge historically, there have only ever been better or worse regimes that asymptotically approach or recede from his laudable ideal of true democracy.

Bilderberg is a thoroughly anti-democratic entity, of course, whose operation seeks to gnaw away at structures and institutions that serve true democratic interests. It ought to go entirely without saying that Bilderberg doesn’t gather in tight secrecy to serve the public good, so why am I saying it again? Because the media owners, newspaper editors and other senior staff, whose crucial role ought to be holding corporations, institutions and politicians to democratic account, have instead been lodged inside Bilderberg for decades and have chosen to function as its willing agents.

Perhaps the most lurid example of the cronyism at this year’s meeting was the surprise appearance (to some) of Trump’s son-in-law Jared Kushner. Kushner’s appointment to Trump’s cabinet is an instant measure of how undemocratic US politics has become:

Today the tide of democracy is receding again and as it recedes so too do our individual freedoms. Restrictions on free speech and free assembly were first tightened by the terrorism bills introduced after 9/11, but the chilling effect of total surveillance is more insidious, as is the clampdown on alternative voices by virtue of deplatforming, shadow bans, and algorithmic discrimination, all of which is carried out by the tech giants who dominate proceedings at Bilderberg.

James Corbett on how tech giants like Google envision the search engine of the future:

The steady militarisation of policing provides a further means for quelling popular resistance, as evidenced by the brutal suppression of the French Gilets Jaunes movement (read more here).

Free Speech Zone by Anthony Freda

At another level, all democracies are highly vulnerable to the infiltration of existing parties and the hollowing out of extant political institutions; a process nearing completion in America and ongoing throughout Western Europe. In all instances, of course, our political systems have managed to retain the outward appearance of democracies. In other ways too, they satisfy the democratic duck test: first and foremost, they still quack like democracies, and are belligerent in quacking that this is the only way to be a democracy. Meanwhile, the power to take decisions that is ostensibly placed in the hands of our elected representatives shifts incrementally to the technocrats – the appointed experts. This is the preferred end game for the bigwigs at Bilderberg.

In the interim, and faced with a genuine crisis at least in terms of western confidence, Bilderberg, which exists and operates solely to promote the interests of established structures of privilege and power, is now hunkered down to such a degree that it has very nearly disappeared from sight again. For their part, the media, reliably obedient to the same insider interests, have purposefully let it disappear.

Image based on a work by Anthony Freda called Presstitute

This year’s location was announced at the eleventh hour thanks to the charade of their annual “press release”: a nod to transparency since they already know the press has no interest whatsoever in reading and reporting on it. And top of this year’s ‘key topics’ (attached to the same press release) was how a system that serves their own plutocratic agenda can survive, or, put in the language of Bilderberg, how to maintain “a stable strategic order”. When it comes to confessing their priorities, could they be any more frank with us?

*

1 Opening lyrics to the famous track “Smoke on the Water”. The inspiration for the song was a fire inside a casino that the band had witnessed across the water of Lake Geneva. Here ‘mobile’ actually refers to a mobile recording studio, although given today’s context it seems fittingly to allude to more contemporary methods of audiovisual recording.

2 From an article entitled “Silicon Valley in Switzerland: Bilderberg 2019 and the High-Tech Future of Transatlantic Power” written by Charlie Skelton published in Newsweek on June 1, 2019. https://www.newsweek.com/silicon-valley-switzerland-bilderberg-2019-and-high-tech-future-transatlantic-1441259

3 Quote taken from Brave New World Revisited (1958), Chapter 3, by Aldous Huxley.

4 Georgetown University awarded Quigley its Vicennial Medal in 1961 and also the 175th Anniversary Medal of Merit in 1964.

5 http://www.4president.org/speeches/billclinton1992acceptance.htm

6

“In 1891, Rhodes organized a secret society with members in a “Circle of Initiates” and an outer circle known as the “Association of Helpers”, later organized as the Round Table organization… In 1909-1913, they organized semi-secret groups known as Round Table Groups in the chief British dependencies and the United States. In 1919, they founded the Royal Institute of International Affairs. Similar Institutes of International Affairs were established in the chief British dominions and the United States where it is known as the Council on Foreign Relations. After 1925, the Institute of Pacific Relations was set up in twelve Pacific area countries.”

Extract taken from Tragedy and Hope: A History of the World in our Time written by Carroll Quigley, The Macmillan Company, 1966, pp. 131–2.

7 ibid. p. 950

8 ibid. p. 946

9 ibid. p. 950

10 ibid. p. 952

11 “The Bank for International Settlements was established in 1930. It is the world’s oldest international financial institution and remains the principal centre for international central bank cooperation.” Taken from the current BIS website.

12 ibid. p. 324

13 In 2018 (Turin), the then-President of Ciudadanos, Albert Rivera Díaz, joined Spanish Deputy Prime Minister Soraya Sáenz de Santamaría of Partido Popular. Rivera Díaz also attended in 2017 (Chantilly), this time alongside the then-Minister of Economy, Industry and Competitiveness, Luis de Guindos (Partido Popular), soon afterwards appointed Vice President of the European Central Bank. In 2016 (Dresden), Bilderberg welcomed Luis Garicano, Professor of Economics at the LSE and Senior Advisor to Ciudadanos.

14

Stacey Abrams announced on Tuesday that she would not run for Senate in 2020, denying Democrats their favored recruit for the race in Georgia. She did not say if she planned to run for president, which she has also been considering doing.

From an article entitled “Stacey Abrams Will Not Run for Senate in 2020”, written by Alexander Burns, published in The New York Times on April 30, 2019. https://www.nytimes.com/2019/04/30/us/politics/stacey-abrams-2020.html

15 From an interview with Jean-Luc Mélenchon conducted by David Broder for The Tribune, published under the title: “Everyone should know – I am very dangerous”. [I am currently unable to find an upload of this piece]


the unreal thing

The following article is Chapter Eight of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.

*

“Advertising is the rattling of a stick inside a swill bucket.”

— George Orwell

*

“Take a card, any card, it’s your choice… but don’t let me see what it is.” The magician fans the cards flamboyantly. We know it’s a trick of course. “Three of Clubs,” he tells us. We shake our heads dismissively – after all, we’re part of the act. The magician seems momentarily perplexed. “Do you have anything in your jacket pocket?” he asks as if desperately trying to turn our attention away from his apparent failure. We feel inside and find a sealed envelope. It’s the one we’d signed earlier in the performance. “Is the seal broken?” he asks, knowingly. “Open it – what’s inside?” We scratch our heads and quietly applaud. Somehow the magician has diverted our attention just long enough to construct the illusion of an altered reality. In truth his method was to “force” the card, and so his illusion relied on the simple fact that we really hadn’t a free choice at any stage. But we applaud because we admire his harmless deception. It amuses us to be deceived once in a while.

*

I saw an advert the other day. It read “Say No to No”, which is the kind of quasi-Zen mumbo-jumbo that advertising executives get paid a small fortune to write. What was the effect of that advertisement? Well, it had suddenly interrupted my original train of thought. I’d probably been looking for the cigarette lighter or wondering how the living room table had got so heaped up with junk again, but now I was reading on about how negativity gets in the way of progress. And which company, I kept wondering as I read down, would attach themselves to such a manifestly new age positive-thinking banner? I read on and came to examples of human achievements that, left to the nay-sayers, could never have happened:

“Yes, continents have been found…”, it read.

Found? By Columbus in 1492, presumably, and then Australia by James Cook. And no human had set eyes on them before? Obviously this is a rhetorical question. I read on…

“Yes, men have played golf on the moon…”

American men to be more precise. And it was indeed an incredible and truly awesome achievement – not the golf, but the travelling to the moon. When it comes to golf, there are obviously far superior facilities a lot closer to home. I read on…

“Yes, straw is being turned into biofuel to power cars…”

Well, hardly in the same league as exploration to such distant lands, but finally some inkling to where they were leading me…

I studied the picture more carefully. The words “Say no to no” are in thick capitals near the top of a blackboard already filled with images of progress and science – molecular structures, conical sections, a diagram showing a spherical co-ordinate system, graphs, line drawings of electron orbits and DNA, of animals and a ship, and of course the ubiquitous pie-chart. A girl, her long straw-blond hair tied back into a pony-tail, and wearing a bright red tank top, has her back turned towards us. She is reaching high, almost on tip-toe, into the black and white and adding the upward flourish of a spiral. Perhaps I was looking at one of those recruitment adverts for teaching, yet something told me otherwise…

And there it was – I’d found it at last – deliberately placed outside the main frame of the picture; a small, emblematic containment for all that progress: a remote, red and yellow scallop shell. The message was far from loud, but that was the point. And once spotted it was very clear, yet it had been intentionally delivered at a subliminal level – out of picture, unobtrusive, easily missed. Its instruction surreptitious and beyond the margins. Why? Because they wanted me to attach the ideas of positivity and progress to the symbol of a multinational oil corporation just as surely as Pavlov’s dogs associated lunch with the ringing of their owner’s bell. They wanted me to feel good things the next time I saw the scallop and to never even think about why.1

*

Advertising is simply another act of illusion and, as with the performing stage magician, the audience is well aware that they are being tricked. But in advertising the illusion runs deeper, so that aside from the obvious aim of persuading us to buy Coke instead of Pepsi or whatever, it very often constructs a host of other frauds. Take again the advert mentioned above as an example, with the girl reaching up on tip-toe. Here nothing is accidental, with all parts and relationships operating together to reinforce our idea of progress as a constant striving toward a better world, whilst in the background it quietly dismisses any “nay-sayers” who disagree. Like many predators, advertisers work by stealth, often, as here, offering glimpses of Utopia, or of wonderful and perpetual advancement, to draw us on and in. The carrot on a stick swinging endlessly before the eyes of the befuddled donkey.

But then, on other occasions, they will take a different tack, and get out a proper stick. They’ll make us uneasy about our looks, or our lack of social status, before offering a quick fix for these problems so frequently of their own devising. There are many ways to ring our bells: both carrots and sticks are equally effective.

And then everyone says this: “Adverts don’t work on me.” So these companies spend literally billions of pounds and dollars on refining their illusions, posting them up all across our cities and towns, filling our airwaves with their jingles and sound-bites, not to mention the ever-widening device of corporate sponsorship, and yet still this remains our self-deluding armour against such unending and ever more sophisticated assaults. I’ll bet you could find more people who’d say David Copperfield can really fly than would actually admit to being significantly influenced by advertising.

*

There probably never was a time when advertising was just that: a way to make products and services more widely and publicly known. In such a time, adverts would simply have shown pictures of the product alongside a short description of its uses and/or advantages. “This is the night mail crossing the border…” – that sort of thing.

Though, of course, here immediately is a bad example, because the famous post office film is not only reminding us of what a jolly useful and efficient service our mail delivery is, but of how wonderfully hard the GPO work whilst the rest of us are asleep. So on this different level Auden’s famous homage is a feel-good thing, encouraging us to connect our good feelings to the postal service; it is an early example of public relations, although still harmless enough in its quiet way.

But audiences get wise, or so we like to imagine, and so today’s advertisers have had to up the ante too. Gone are the days of telling you how to have “whiter whites” or advising everyone (with only a hint of surrealism) to “go to work on an egg”. Nowadays you’re far more likely to choose to eat a certain chewy stick because “it’s a bit of an animal” (without even noticing the entirely subliminal reference to your feelings about being carnivorous) or drink a can of soft drink because “image is nothing” (which presumes a ridiculous double-think on the part of the targeted purchaser). And where once a famous Irish beverage was just “good for you”, now it’s better because it comes “to those who wait”. Here you’re asked to make an investment in the form of time; an investment that is intended to add personal value to the brand.

Adverts are loaded with these and other sorts of psychological devices – cunningly latent messages or else entertaining ways of forging brand loyalty. They prey on the fact that we are emotional beings. They use tricks to bypass our rational centres, intending to hard-wire the image of their products to our feelings of well-being, happiness, contentment, success, or more simply, the image we have of ourselves. They use special words. LOVE for instance. Just see how many adverts say “you’ll love it”, “kids love it”, “dogs love it”, “we love it”, and so on and so on…. one I saw recently for condoms said simply “love sex” – talk about a double whammy!

Advertisers also like to scare us. When they are not showing us washing lines drying over the Fields of Elysium, or happy pals sharing time with packets of corn snacks, or elegant cars effortlessly gliding down open highways, they are constructing worlds of sinister dangers. Germs on every surface, and even in “those hard to reach places”. Threats from every direction, from falling trees to falling interest rates. I once saw a TV advert that showed a man desperately running from a massive and menacing fracture. It was a crack that seemed to be ripping through the very fabric of space and time, an existential terror relentlessly chasing after him through some post-apocalyptic nightmare. After a minute or so the threat abated and a solution was offered. Get your windscreen checked, it calmly advised.

And the government get in on this too. Watch out, watch out, there’s a thief about! Just say no to drugs! Sex is fun, but take precautions and don’t die of ignorance! In these ways, they ramp up fears of the real dangers we face, whilst also inculcating a sense of trust in the powers that be. The world is a perilous and unjust place, they say (which is true); fortunately, we are here to help you. Trust us to guide you. Obey our instructions. To protect you and your loved ones. To help you to realise your dreams. Together, we will make the world a fairer place. The constant PR refrains: “Believe”, “Belong”, “Trust”, and more recently, “Hope and Change”. O, ring out those bells!

*

Right now, there’s something refreshingly honest about smoking. Those of us who refuse or are unable to quit are left under absolutely no illusions about our little cancer sticks. We know perfectly well that each drag is bringing the grave that little bit closer. And it’s certainly not cool to smoke. Our clothes stink, our breath stinks, and stinking, we huddle outdoors, rain or shine, cluttering up the office doorways with our toxic fumes and heaps of fag-ends. But it wasn’t always so. Smoking had its golden age. A time when cigarettes were an accoutrement to style and when sharing a fag with a dame was nearly as great as sex.2 During this period, the tobacco industry invested a small fortune in maintaining their myth. They paid to lobby politicians, they made funds available for favourable medical research, and perhaps most significantly of all, they hired the best PR man in the business.

It can be fun to speculate on who were the most influential figures in history. Who would we wish to include? Great statesmen, formidable warriors, innovators, engineers, scientists and artists: when lists are polled for, the public generally take their pick from these, chucking in the odd saint or celebrity just for good measure. They choose between Churchill, Washington, Alexander the Great, Thomas Edison and Albert Einstein, and if the criteria are widened to include villains as well as heroes, plump for Adolf Hitler, Mao Tse-tung and Joseph Stalin. A selection, if you like, of the stars of the show. But what about people whose work went on behind the scenes? What of those whose greater skill was to remain invisible or simply unnoticed? Edward Bernays was just such a man.

*

To say that Bernays was a great PR man is to do him a considerable disservice, for Bernays, who also happened to be a nephew of no lesser light than Sigmund Freud, is nowadays regarded as the father of modern PR. He wrote the book. Rather candidly, he entitled it simply Propaganda – the word, deriving from the Latin for “propagation”, was less sullied back in 1928. In the opening chapter Bernays lays out the situation as he sees it:

“The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.”

But Bernays is not warning us here, far from it. This is merely the way the world works, spinning along in a fashion that Bernays regards as both inevitable and to a great extent desirable. Better an orderly world of unseen manipulation than a world of ungovernable chaos. And it’s this point which he makes perfectly explicit in the very next paragraph:

“We are governed, our minds are molded, our tastes formed, our ideas suggested, largely by men we have never heard of. This is a logical result of the way in which our democratic society is organized. Vast numbers of human beings must cooperate in this manner if they are to live together as a smoothly functioning society.”3

We should perhaps not be surprised to learn then that Bernays’ book was one that didn’t make it onto the bonfires of the Third Reich. On the contrary, by Bernays’ own later account, Joseph Goebbels kept his work in his library and drew upon it in building the Nazi propaganda machine. Certainly, it is a very practical guide. It delves into a great many areas and asks important questions. One of the most significant goes as follows:

“If we understand the mechanism and motives of the group mind, is it not possible to control and regiment the masses according to our will without their knowing it?”4

And the answer, as Bernays went on to prove with his amazing success in promoting everything from bacon and eggs to soap powder and political candidates, was HELL YES!

Working for the American Tobacco Company, Bernays had even piggy-backed a ride on the women’s rights movement, encouraging the fairer sex, for whom smoking in public was still very much a taboo, to keep on lighting their “Torches of Freedom”. Not that any similar strategy could work today obviously… well, not unless those torches were organically-grown by fair-trade tobacco farmers and rolled in chlorine-free paper supplied by sustainable forests, or whatever.

Bernays was the great promoter, perhaps the greatest, and he was keen to promote his own product, modern advertising, or as he called it propaganda, above all else. For Bernays, just as for his acolyte Joseph Goebbels, the future was propaganda:

“Propaganda will never die out. Intelligent men must realize that propaganda is the modern instrument by which they can fight for productive ends and help to bring order out of chaos.”5

*

Following Bernays, advertising no longer stops at breakfast cereals, toothpaste and petrochemical companies, having extended its parasitic tendrils throughout all areas of life, so that image becomes everything. Newspapers and magazines are glossier than ever. They radiate forth into the empty void of secular consumerist existence, visions of earthly fulfilment that can be bought (at preferential interest rates) – holidays, home improvements, house moves (especially abroad), fast cars, and millionaire lifestyles.

They tell us what it is right to think about: beauty, health, fashion and that oh-so-elusive of attributes, style. They tell us “how to get on”. They tell us what’s worth worrying about. DO worry about your wrinkles. DO worry about your waistline. DO worry about your split-ends. DO WORRY – because you’re worth it! Just as importantly, we get to learn what is worth thinking about: success, fame and glamour, which when multiplied together make celebrity. Celebrity: from the Latin celebrare, meaning to celebrate, or to honour. So whereas the ancients believed that the fixed and eternal heavenly stars were gods, we instead are sold a parallel myth revolving around “the stars of today”.

But newspapers and magazines are nothing, for their influence pales into insignificance when set beside that flickering blue screen in the corner of the living room. It is our gateway to another world, a parallel dimension, where we are welcomed back each day by our virtual friends. It is a fire to warm us. A shadow-play of mesmerising potency. And here, the ever-tantalising jam of tomorrow has finally slopped over from its earlier containment within commercial breaks, to become what is now a mainstay of entire broadcasting schedules. Carrots and sticks for us to nod along to, 24/7, three hundred and sixty-five days of the year.

It’s not even that all television is bad. Some is excellent. I would cite as an exemplar the consistently superior content of BBC wildlife documentaries, which far exceed any comparable alternative whether offered by books, radio, or at the cinema. Here is television at the very pinnacle of its achievement.

A great deal on television is produced just to amuse us, or amaze us, and occasionally, actually to inform us, and much of this merits credit too, but I do not feel it necessary to waste time pushing an open door. We all know that television can sometimes be marvellous. But we also know that most of it is junk. Junk that, with the influx of multiple digital channels, is spread ever more thinly and widely. In a modern world television certainly has its place, but we will do well never to forget its unprecedented powers:

“Right now there is an entire generation that never knew anything that didn’t come out of this tube. This tube is the gospel, the ultimate revelation. This tube can make or break presidents’ hopes… This tube is the most awesome God-damn force in the whole godless world, and woe is us if ever it falls in the hands of the wrong people…

And when the twelfth largest company in the world controls the most awesome God-damned propaganda force in the whole godless world, who knows what shit will be peddled for truth on this network. So you listen to me – Listen to me – Television is not the truth. Television’s a god-damned amusement park…

We’re in the boredom killing business… But you people sit there day after day, night after night, all ages, colours, creeds – We’re all you know – You’re beginning to believe the illusions we’re spinning here. You’re beginning to think that the tube is reality and that your own lives are unreal. You’ll do whatever the tube tells you. You’ll dress like the tube, you’ll eat like the tube, you’ll raise your children like the tube. You even think like the tube. This is mass madness. You maniacs! In God’s name, you people are the real thing – we are the illusion.”

Of course, if you’ve seen the film Network, from which this extraordinary rant is taken, then you’ll also be aware that these are the words of a madman!6

At the top of the chapter I quoted Orwell’s no-nonsense assessment of advertising, and advertising is indeed as he describes it: the rattling stick eliciting the same Pavlovian response in the pigs as advertising executives wish to implant in our human minds, their main intent being to push their clients’ products by making us salivate with desire. This was no different in Orwell’s time. Advertising’s still uglier parent, propaganda, has meanwhile always aimed to change minds more fundamentally. It treats ideas as products and sells them to us. But the techniques of both advertising and propaganda have come a long way since Orwell’s day.

This power to propagandise has grown in large part because of television. The blue screen softly flickering away in the corner of every living room has opened up the possibility for thousands of ‘messages’ each day to be implanted and reinforced over and over. Unconsciously absorbed instructions to think in preformed patterns are precisely what Aldous Huxley thought would be needed if the seething and disorderly masses of any ordinary human population were ever to be replaced by the zombie castes of his futuristic vision Brave New World. “Sixty-two thousand four hundred repetitions make one truth”, he wrote.7

Which is a joke, but like so much in Huxley’s work, a joke with very serious intent. Huxley’s vision of a future dystopia is in some ways subtler than Orwell’s own masterpiece Nineteen Eighty-Four, not least because the mechanisms of mind control are wholly insidious: Huxley shows how you don’t have to beat people into submission in order to make them submit. Yet even Huxley never envisaged a propaganda system as pervasive and powerful as television has eventually turned out to be.

*

Advertising involves “the art of deception” and it has never been more artful than it is today… sly, crafty, cunning, scheming, devious, sneaky, and totally calculating. However, it is increasingly artful in that other sense too: being achieved with ever greater creative skill. Indeed, the top commercials now cost more than many feature films, and, aside from paying small fortunes for celebrity endorsement, the makers of our grandest and most epic commercials take extraordinary pains to get the details right.

Engineered to push the buttons of a meticulously studied segment of the population, niche marketing techniques ensure precise targeting with optimum impact. Every image, sound and edit is honed, because time is money when you’re condensing your ‘message’ into thirty seconds. It is perhaps not surprising, therefore, that these commercial ‘haikus’ are regarded by some as the works of art of our own times. A view Andy Warhol (himself a former ‘commercial artist’) espoused and helped promote – though mostly he made his fortune espousing and promoting his own brand: a brand called Andy Warhol.

Warhol wrote that:

“The most beautiful thing in Tokyo is McDonald’s. The most beautiful thing in Stockholm is McDonald’s. The most beautiful thing in Florence is McDonald’s. Peking and Moscow don’t have anything beautiful yet.”8

Russian composer Igor Stravinsky is credited with a far better joke, having once remarked that “lesser artists borrow, but great artists steal”. As with Warhol’s quip, it fits its author well. Stravinsky here downplaying his unrivalled talent for pastiche, whereas Warhol could never resist hiding his gift for nihilism in plain sight.

But actually, advertising isn’t art at all, of course. Do I need to continue? It is a bloodless imitation that neither borrows nor steals, to go back to Stravinsky’s aphorism, but directly counterfeits. Feigning beauty and faking truth is all it knows, with a passing interest in the first in so far as it is saleable, and a pathological aversion to the second, since truth is its mortal enemy.

For if selling us what we least require and never thought we desired is advertising’s everyday achievement (and it is), then pushing products and ideas that will in reality make our lives more miserable or do us harm is its finest accomplishment. And the real thing? Like the stage magician, this is what the admen assiduously divert your attention away from.

Which brings me to a story. A real story. Something that happened as I was driving to work one dark, dank February morning. A small thing, but one that briefly thrilled and delighted me.

It was at the end of Corporation Street, fittingly enough I thought, where someone had summoned the courage to take direct action. Across the glowing portrait of a diligently air-brushed model were the words: “She’s not real. You are beautiful.”

That some anonymous stranger had dared to write such a defiant and generous disclaimer touched me. But it didn’t end there. This person, or persons unknown, had systematically defaced all three of the facing billboards, saving the best for last. The last was one of those ‘messages’ determined to scare some back into line, whilst making others feel smug with a glow of compliant superiority. It read: “14 households on Primrose Street do not have a TV licence” (or words to that effect).

The threat, though implicit, was hardly veiled. In Britain, more than a hundred thousand people every year are tried and convicted for not having a TV licence. Some are actually jailed.9 But now this message had a graffiti-ed punchline which again brought home the hidden ‘message’ perpetuated by all of advertising. The spray-canned response read simply: “perhaps they’ve got a life instead.” A genuine choice the admen wouldn’t want you to consider. Not buying into things isn’t an option they can ever promote.

To add my own disclaimer, I in no way wish to encourage, nor do I endorse, further acts of criminal damage – that said, here is a different piece of graffiti (or street art – you decide) that I happen to walk past on my way into work. In a less confrontational way, it too has taken advantage of an old billboard space:

the best things in life

Next chapter…

*

Addendum: a modest proposal

We are all living under a persistent and dense smog of propaganda (to give advertising and PR its unadorned and original name). Not only our product preferences and brand loyalties, but our entire Weltanschauung10 fashioned and refashioned thanks to a perpetual barrage of lies. Fun-sized lies. Lies that amuse and entertain. Lies that ingratiate themselves with fake smiles and seductive whispers. And lies that hector and pester us, re-enforcing our old neuroses and generating brand new ones. These lies play over and over ad nauseam.

Ad nauseam, the sickness of advertising, is a man-made pandemic, with modern commercials selling not simply products per se, but “lifestyles”. And think about that for a moment. Off-the-shelf ideals and coffee table opinions that are likewise custom-made. Beliefs to complement your colour-coordinated upholstery, your sensible life insurance policy, your zesty soap and fresh-tasting, stripy toothpaste.

Thanks to television, we inhale this new opium of the people all day long and few (if any) are immune to its intoxication, but then advertising operates at a societal level too – since by disorientating individuals, society as a whole becomes more vulnerable to the predatory needs of corporations. So cuddling up to the box and laughing along to the latest blockbuster commercial on the grounds that “adverts don’t affect me” just makes our own delusion complete.

I might have ended on a lighter note, but instead I’ll hand over to the late Bill Hicks at his acrimonious best (and apologies for his foul and abusive language, but unfortunately here it is fully warranted):

“By the way, if anyone here is in marketing or advertising kill yourselves…”

Bill pauses to absorb any cautious laughter, then quietly continues: “Just a thought… I’m just trying to plant some seeds. Maybe, maybe one day they’ll take root… I don’t know, you try, you do what you can…”

Still scattering handfuls of imaginary seeds, but now sotto voce for suggestive effect: “Kill yourselves…”

Another pause and then completely matter of fact. “Seriously though – if you are – do!”

And now Bill gets properly down to business: “Ahhh – No really – There’s no rationalisation for what you do and you are Satan’s little helpers okay… Kill yourselves. Seriously. You are the ruiners of all things good. Seriously. No, no, this is not a joke… Ha, ha, there’s going to be a joke coming… There’s no fucking joke coming! You are Satan’s spawn filling the world with bile and garbage. You are fucked and you are fucking us – Kill yourselves – It’s the only way to save your fucking soul – kill yourself…”

Then he comes to the crux of the matter: “I know what all you marketing people are thinking right now too: ‘Oh, you know what Bill’s doing. He’s going for that anti-marketing dollar. That’s a good market. He’s smart…’ – Oh Man! I’m not doing that! You fucking evil scumbags! – ‘You know what Bill’s doing now. He’s going for the righteous indignation dollar. That’s a big dollar. Lot of people are feeling that indignation. We’ve done research – huge market! He’s doing a good thing.’ – God damn it! I’m not doing that you scumbags…! Quit putting the dollar sign on every fucking thing on this planet!”

If we are ever to break free from the mind-forged manacles of the advertising industry then we might consider the option of broadcasting Bill Hicks’ rant unabridged during every commercial break on every TV channel on earth for at least a year – the obscenities bleeped out in broadcasts before the watershed!

While we’re about it, we will need a screening prior to every movie (during the commercial slots obviously) as well as key phrases rehashed into jingles and those same sound bites written up in boldface and plastered across every available billboard. Now, if you think this would be altogether too much of an assault on our delicate senses then please remember that this is precisely what the dear old advertising industry does day-in and day-out. So wouldn’t it be fun to turn the tables on those in the business of deceit? And not simply to give them a dose of their own snake oil, but to shock us all with repeated jolts of truth instead.

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.

*

1 Incidentally, my young nephew had added a few scribbles of his own to this advertisement and it is interesting to note where he directed his pen marks, five places in all: one over each of the girl’s hands, one on the back of her head and another on her ponytail. And his only scribble that was not on the girl was on top of the scallop. Bullseye!

2 Of course in Hollywood films of a bygone age when censorship was strict, sharing a fag was also used as a metaphor for sex itself.

3 Taken from the opening to Chapter 1 entitled “Organising Chaos” of Propaganda, by Edward Bernays (1928).

4 Ibid. Chapter 4, “The Psychology of Public Relations”.

5 Ibid. Chapter 11, “The Mechanics of Propaganda”.

6 “I’m as mad as hell, and I’m not going to take this anymore!” These are the words of anti-corporate evangelist Howard Beale, taken from the film Network (1976). A satire about a fictional television network called Union Broadcasting System (UBS) and its unscrupulous approach to raising the ratings, Network was written by Paddy Chayefsky and directed by Sidney Lumet. Most memorably, it features an Oscar-winning performance by Peter Finch, playing the part of disaffected news anchor Howard Beale. Beale, having threatened to commit suicide live on air, is given his own show. Billed as “the mad prophet”, he steals the opportunity to angrily preach against what he sees as the corporate takeover of the world, and steadily his show gathers the largest audience on television. The consequences are, of course, inevitable.

7 “One hundred repetitions three nights a week for four years, thought Bernard Marx, who was a specialist on hypnopædia. Sixty-two thousand four hundred repetitions make one truth. Idiots!” From Chapter 3 of Brave New World by Aldous Huxley, published in 1932. 

8 Quote taken from Chapter 4 “Beauty” of The Philosophy of Andy Warhol: (From A to B and Back Again), published in 1975. 

9 “According to the most recent figures, about 70 people a year are jailed for TV licence fee offences. But the scale of prosecutions for licence fee evasion is far higher and now accounts for one in nine of all Magistrates Court cases. More than 180,000 people – almost 3,500 a week – appeared before the Magistrates Courts in 2012, accused of watching television without a valid licence, with 155,000 being convicted and fined.”

From an article entitled ‘Dodging TV licence will not be a crime’ written by Tim Ross, published in The Telegraph on March 7, 2014. http://www.telegraph.co.uk/culture/tvandradio/bbc/10684639/Dodging-TV-licence-will-not-be-a-crime.html

10 Weltanschauung: a particular philosophy or view of life; the world view of an individual or group.


the price of everything

The following article is Chapter Nine of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.

*

“When the accumulation of wealth is no longer of high social importance, there will be great changes in the code of morals. We shall be able to rid ourselves of many of the pseudo-moral principles which have hag-ridden us for two hundred years, by which we have exalted some of the most distasteful of human qualities into the position of the highest virtues. We shall be able to afford to dare to assess the money-motive at its true value. The love of money as a possession — as distinguished from the love of money as a means to the enjoyments and realities of life — will be recognised for what it is, a somewhat disgusting morbidity, one of those semi-criminal, semi-pathological propensities which one hands over with a shudder to the specialists in mental disease…”

John Maynard Keynes 1

*

Have you ever wondered what it’s like to be rich? Here I don’t just mean well-off, with a paltry few tens of millions in the bank, I mean proper rich – megabucks! So much money that, as I heard one comedian put it (aiming his joke squarely at the world’s richest entrepreneur), if Bill Gates were to stuff all his cash under the mattress, then due to interest alone, if he fell out of bed he’d never hit the ground!

I suppose what I’m wondering is this – and perhaps you’ve found yourself thinking along similar lines – why are these super-rich guys always so intent on accruing ever greater wealth when they already possess more than enough funds to guarantee the needs of a small country? Think about it this way: Gates and the others are, barring a few very necessary legal constraints, completely at liberty to do whatever they choose at every moment of every day. They can eat the best food, drink the most delicious vintage wines, smoke the finest cigars, play golf morning, noon and evening, and then after the sun goes down, if it is their wont, have liaisons with the most voluptuous women (or men) available. Quite literally, they have the means to go anywhere and do everything to their heart’s content, and all at a moment’s notice. Just imagine that. So why bother about sales at all? I mean, wouldn’t you eventually get bored of simply accumulating more and more money when you’ve already got so much – and let’s face it, money itself is pretty boring stuff. So just what is it that keeps them all going after it? After all, there are only so many swimming pools, grand pianos, swimming pools in the shape of grand pianos, Aston Martins, Lear Jets, and acreages of real estate that one man (or woman) can profitably use (in the non-profit-making sense obviously). Economists would call this the law of diminishing marginal utility, although in this instance it is basic common sense.2
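
As a rough illustration of what the economists mean, here is a minimal sketch assuming the common textbook device of logarithmic utility; the assumption and the numbers are mine, not anything claimed in this book:

    # A toy illustration of diminishing marginal utility.
    # Assumption (mine, not the author's): satisfaction from wealth w
    # follows the textbook logarithmic curve u(w) = ln(w).
    import math

    def utility(wealth):
        """Log utility: each extra pound adds less satisfaction than the last."""
        return math.log(wealth)

    for wealth in (1_000, 1_000_000, 1_000_000_000):
        gain = utility(wealth + 1_000) - utility(wealth)
        print(f"Extra utility from £1,000 more at £{wealth:,}: {gain:.7f}")

    # Output: roughly 0.6931 at £1,000, 0.0010 at £1 million,
    # and 0.000001 at £1 billion - the billionaire barely notices.

On this picture the puzzle only sharpens: each additional million should matter less and less, and yet the accumulation continues.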

Presented with evidence of this kind, some will say that here is further proof of the essential greediness of human beings. That, as a species, we are simply never satisfied until we have the lot. Fine then, let us take on this modern variant of original sin, since it certainly holds more than a grain of truth. For the sake of argument, we might presume that all men and women are greedy to an almost limitless extent. That this is truly the natural order, and that from our conception we have been evolutionarily programmed to grab as much as we can for ourselves – our most primeval reflex being to snatch.

So I shall not waste too much time here. Only to say that I do not find such unrestrained cupidity within the circles of people with whom I have chosen to associate, most being happy enough to share out the peanuts and fork out for the next round of beers, quite oblivious to outcomes in terms of commensurate returns. What comes around goes around… There is, of course, no doubting that most folks will, very naturally, if opportunity arises, take good advantage to feather their own nests: making life a little more comfortable for themselves, and reserving the greater share of their fortune for their immediate family and closest friends. But then, why not…? Charity begins at home, right?

What most don’t do (at least in the circles I know best) is devote their whole lives to the narrow utilitarian project outlined above. And why? Because, though money and property are quite understandably prized assets, they offer lesser rewards than companionship and love. And, in any case, pure generosity is its own reward – and I do mean “is”, and not “has” or “brings” – the reward being an inseparable part of the act itself: a something received as it was given, like a hug, like a kiss. That said, if you still prefer to believe that we are all, to a man, woman and child, innately and incurably selfish and greedy, then next time you take a look into the mirror, do consider those all-too-beady eyes staring back. It’s very easy to generalise about mankind when you forget to count yourself in.

But if not intractably a part of human nature, then we must find other reasons to account for how our world is nevertheless so horribly disfigured by rampant and greedy exploitation. For if greed is not an inherently human trait – and here I mean greed with a capital Grrr – then this monomaniacal obsession is all too frequently acquired, especially by those who approach the top of the greasy pole. There is an obvious circularity in this, of course: those whose progress has depended upon making a buck very often become addicted. As money-junkies, they, like other addicts, then prioritise their own fix above all else. Whether or not these types are congenitally predisposed to becoming excessively greedy, we have no way of knowing. What we can be certain of is this: that by virtue of having acquired such great wealth, they disproportionately shape the environment they and we live in. So they are not merely money-junkies, but also money-pushers: if you’re not a money-junkie then you don’t know what you’re missing. There’s nothing new in this. This is the way the world has been for many centuries, and perhaps ever since money was first invented.

So here’s Oscar Wilde addressing the same questions about money and our unhealthy relationship to it; his thoughts leaping more than a century, during which time very little has apparently changed:

“In a community like ours, where property confers immense distinction, social position, honour, respect, titles, and other pleasant things of this kind, man, being naturally ambitious, makes it his aim to accumulate this property, and goes on wearily and tediously accumulating it long after he has got far more than he wants, or can use, or enjoy, or perhaps even know of. Man will kill himself by overwork in order to secure property, and really, considering the enormous advantages that property brings, one is hardly surprised. One’s regret is that society should be constructed on such a basis that man has been forced into a groove in which he cannot freely develop what is wonderful, and fascinating, and delightful in him – in which, in fact, he misses the true pleasure and joy of living.”3

Embedded below is a recent interview [from December 2013] Pulitzer Prize-winning journalist Chris Hedges gave on “The Real News” in which he talked about – based to a large extent on his own personal experience – how the super rich are isolated and disconnected from the rest of society. He explains how this creates a deluded sense of entitlement and a pathological callousness:

*

Isn’t money funny stuff! Funny peculiar, I mean. We just take it so much for granted, almost as though it were a natural substance (disappointingly, of course, it doesn’t actually grow on trees). But when we do think about it, money has far stranger properties than anything in the natural world. And our relationship to it is more peculiar than our relationship to almost anything else.

Money, that’s what I want… sang the Beatles on one of their less celebrated tracks. But the truth will out. So just why did the Beatles want money, and, for that matter, why do I, and why do you? It doesn’t work, you can’t eat it, and it’s not, as a rule, a thing of special beauty. Money is in fact absolutely useless, right up until you decide to swap it for what you actually want.

Money can’t buy me love, true again, but it might buy me a chocolate bar. Because money is really just a tool, a technology: a highly specialised kind of lubricant that enables people to exchange their goods and services with greater ease and flexibility. The adoption of a money system enables levels of parity for otherwise complex exchanges to be quickly agreed and settled. The great thing about money being, to provide a concrete illustration, that although £1 of tinned herring is probably equivalent to about thirty seconds of emergency plumbing (if you’re lucky), you won’t require crates of herring to pay for the call-out. So far so simple.

Except wait. We all know how the price of herring can go up as well as down, and likewise the price of emergency plumbers. So why such a dynamic relationship? Well, there’s “the market”, a price-fixing system that arises spontaneously, regulating the rates of exchange between goods and services on the basis of supply adjusting to match demand. Thus, by a stroke of good fortune, we find that money is not merely a lubricant for exchange, but also a regulator of useful production and services. This, at least, is the (widely accepted) theory.

Prices rise and fall in accordance with demand. Things that are in short supply become expensive, things that are abundant are cheaper. This is basic economic theory and it means, amongst other things, that in every transaction the “real value” of your money is actually relative, for the simple reason that the amount required depends not only on what you’re after, but also upon whether or not other people are after the same kind of thing. Money then, in terms of its “real value” to any individual or group, is something that is constantly varying. We might call this “the relativity of money”.

One consequence of the relative nature of money is that the useful value of money overall can also rise and fall. It is possible for wholesale, retail and labour costs all to rise or fall more or less together, although the general tendency, as we all know from experience, is for overall rising costs. Indeed such “inflation” is regarded as normal and expected, and, as a consequence, it comes to seem just as natural as money itself. Yet since you always need more and more money to buy the same things, the value of your money must, in some important way, be constantly falling. But just why does money as a whole lose its value in this way? What makes yesterday’s money worth less than today’s? Well, it turns out that this is a huge question and one that economists have argued long and hard about.

One partial account of inflation goes as follows: businesses and people in business are constantly looking for a little bit more – for how else can they maximise profits? In direct consequence, we, as customers, necessarily require more dosh to pay for the same goods or services. But enlarging our budgets automatically requires a commensurate increase in income, which means successfully negotiating for a larger salary. In the bigger picture then, the businesses supplying our wants and needs now need to cover their larger wage-bills, which means higher prices to compensate. So prices and incomes rise together, with money becoming worth less and less precisely because everyone is trying to accumulate more and more of it. This endless tail-chasing escalation, which is given the fancy title of “the price/wage spiral”, serves as an excellent example of why money is really very odd stuff indeed.
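
To see the tail-chasing in miniature, here is a minimal sketch of the spiral; the 3% annual mark-up is invented purely for illustration:

    # A toy price/wage spiral: prices chase wages chase prices.
    # The 3% annual mark-up is invented purely for illustration.
    price_index, wage_index = 100.0, 100.0

    for year in range(1, 6):
        price_index *= 1.03  # businesses raise prices, looking for a little bit more
        wage_index *= 1.03   # workers win raises to cover the higher prices
        real_wage = 100 * wage_index / price_index
        print(f"Year {year}: prices {price_index:.1f}, wages {wage_index:.1f}, "
              f"real wage {real_wage:.1f}")

    # Both nominal columns climb year on year, yet the real wage - wages
    # measured in goods - never moves: everyone accumulates more money
    # while each unit of it buys less.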

And what is money in any case? The first traders most likely exchanged shells, precious stones, or other baubles to aid in bartering, but naturally enough, over time these exchanges would have been formalised, agreements arising with regard to which objects and materials were most acceptable as currency. The material that eventually became most widely accepted was, of course, gold. But why gold? Well, no one actually knows, but we can make some educated guesses.

Firstly, gold is scarce, and it is also rare in other ways – for instance, having a unique and unusual colour, which just happens to correspond to the colour of the Sun. The fact that it is almost chemically inert and so doesn’t tarnish means that it also shines eternally, and so again is like the Sun. Indeed, Aldous Huxley, in Heaven and Hell (his sequel to The Doors of Perception), points out that almost every substance that humans have ever regarded as valuable shares this property of shininess. To Huxley this is evidence that even money owes its origins, in part at least, to a common spiritual longing. Our wish to own a precious piece of paradise.

But back to more mundane matters, if gold (or any other substance) is chosen as your currency, then there arises another problem. How to guarantee the quantity and quality of the gold in circulation? For if gold is worth faking or adulterating then it’s certain that somebody will try cheating.

Well, one answer could be the adoption of some kind of official seal, a hallmark, and this solution leads, naturally enough, to the earliest forms of coinage. But then, if the coins are difficult to counterfeit, why bother to make them out of gold in the first place? Just the official seal would be enough to ensure authenticity. And why bother with metal, which is bulky and heavy? So again it’s an obvious and logical leap to begin producing paper banknotes. The value of these coins and banknotes, although far less intrinsically valuable in material terms than the gold they represent, is still backed by the promise that they are redeemable into gold. But hang on, what’s so special about the gold anyway (aside from its shininess)? And doesn’t the gold, which is now locked up in bullion reserves, in fact have real uses of its own? And doesn’t this mean that the gold also has a monetary value? So why not cut loose from the circularity and admit that the value of money can exist entirely independently of gold or of any other common standard? Indeed, why couldn’t the issuing authority, which might be a government but is more often a central bank, simply make up a “legal tender”4 with no intrinsic or directly correlated value whatsoever and issue that? Not that the money issued need even correspond to the amount of real coins or paper banknotes in circulation – most of the world’s money being bits and bytes, ones and zeroes, orbiting out in cyber-space. Which brings us to just how funny money has now become.

The Pound Sterling, the various dollars, the Euro and every major currency on Earth are, to apply the correct terminology, “fiat currencies”.5 With fiat currencies there is no parity to the value of any other commodity, and so they are, if you like, new forms of gold. As such, and given their shifting relative values, these new fiat currencies can also be traded as another kind of commodity: money, in the form of currency, becoming an investment in itself. Money is strange stuff indeed.

Yet money also remains an instrument. And we use this instrument to measure just about everything. To establish the value of raw materials and manufactured items. The value of land and, by extension, the value of the space it occupies. The value of labour, and thus a value on the time used. And, since works of art are also bought and sold, money is even applied as a measure of such absolutely intangible qualities as beauty.

So money is basically a universally adaptable gauge, and this is its great strength. It is perhaps the big reason why its invention gradually caught on in such a fundamental way. From humble trading token, money has risen to become a primary measure of all things. But remember, remember… Money, whether fiat currency or gold standard, can never be real in the same way as tins of herring and plumbers are real, and neither is “monetary value” an absolute and intrinsic property, but only ever relative and acquired. Money, we ought to constantly remind ourselves (since we clearly need reminding) is nothing without us or without our highly structured civilisation – intrinsically, it is worthless. It is very strange stuff.

Perhaps the future benchmark for money will no longer be gold but ‘virtual gold’ in the form of cryptocurrencies – bitcoin being currently the most well-known of these. One advocate of these alternatives to traditional forms of money is financial expert Max Keiser. On February 3rd 2014, he spoke with coder, hacker and cryptocurrency specialist Andreas Antonopoulos about the regulation of bitcoin transactions; the advent of bitcoin derivatives, which he believes are less of a threat than ordinary derivatives (a subject I’m coming to next); the fact that unlike gold, cryptocurrencies can be ‘teleported’; and a future in which bitcoin is used widely by businesses as much as by individuals. He says that a time is coming when the prevalent misgivings and doubts about bitcoin and other cryptos will have long since been forgotten. Is he right? I don’t know and remain highly sceptical, but I find the debate an interesting one:

Incidentally, there are less radical and more tangible alternatives to the currencies we now have in circulation. “Treasury notes” are one such alternative and these have historical precedents in the form of both the American “greenback” and the UK’s Bradbury Pound. To read more about this, and for links to campaigns to reintroduce them, please read the addendum at the end of the chapter.

*

Little more than a century ago, and even in the richest corners of the world, there were no dependable mechanisms to safeguard against the vicissitudes of fortune. If you weren’t already poor and hungry (as most were), then you could rest assured that potential poverty and hunger were waiting just around the corner. Anyone with aspirations to scale the ladder to secure prosperity faced the almost insurmountable barriers of class and (a generally corresponding) lack of education. A lower class person of such ambitions would be very well aware that if they could step onto the ladder at all, there was very little in the way of protection to save them in the event of falling; errors of judgement or sheer misfortune resulting in almost certain and unmitigated personal disaster. This was the sorry situation for people at all levels of society aside from the highest echelons.

One tremendous advantage then, of living in a modern society, is that, aside from having slightly less restricted social mobility (not that we now live in the classless society we are told to believe in), there are basic safety nets in place, with additional protection that is optionally available. For those languishing at the bottom of the heap, there are the reliable though meagre alms provided through a welfare system, whilst for the ever-expanding middle classes there is plenty of extra cover in the form of saving schemes, pension schemes, and, in the event of the most capricious and/or calamitous of misfortunes, the ever-expanding option of insurance policies. If the Merchant of Venice had been set in today’s world then the audience would feel little sympathy for his predicament. Why had he ventured on such a risk in the first place, casting his fortune adrift on dangerous waters? Why hadn’t he protected his assets by seeking independent financial advice and taking out some preferential cover? It’s a duller story altogether.

Systems for insurance are essential in any progressive civilisation. Protection against theft, against damage caused by floods, fires and other agents of destruction, and against loss of life and earnings. Having insurance means that we can all relax a bit, quite a lot, in fact. But it also means that, alongside the usual commodities, there’s another less tangible factor to be costed and valued. That risk itself needs to be given a price, and that necessarily means speculating about the future.

Indeed, speculations about the future have come very much to the forefront of financial trading. As a consequence, at least in part, today’s financial traders have become accustomed to dealing in “commodities” that have no intrinsic use or value whatsoever. They might, for example, exchange government bonds for promises of debt repayment. Or, feeling a little more adventurous, they might speculate on the basis of future rates of foreign exchange, or on interest rates, or share prices, or rates of inflation, or on a multitude of other kinds of “underlying assets” (including that most changeable of underlying variables: the weather) by exchange of promissory notes known most commonly as “derivatives”, since they derive their value entirely from the future value of something else. And derivatives can be “structured” in myriad ways. Here are just a few you may have heard of (a small worked sketch follows the list):

  • futures (or forwards) are contracts to buy or sell the “underlying asset” at a specified future date, at a price agreed today.
  • options allow the holder the right, without obligation (hence “option”), to buy (a “call option”) or to sell (a “put option”) the “underlying asset” at an agreed price.
  • swaps are contracts agreeing to exchange flows of money up until a specified future date, based on the underlying value of exchange rates, interest rates, commodity prices, stocks, bonds, etc.
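
To give the flavour of how these behave, here is the small worked sketch promised above, using the standard textbook payoff formulas for options at expiry; the strike and spot prices are invented:

    # Textbook payoffs of a call and a put option at expiry.
    # Strike and spot prices below are invented for illustration.
    def call_payoff(spot: float, strike: float) -> float:
        """Right to buy at the strike: pays off only if spot > strike."""
        return max(spot - strike, 0.0)

    def put_payoff(spot: float, strike: float) -> float:
        """Right to sell at the strike: pays off only if spot < strike."""
        return max(strike - spot, 0.0)

    strike = 100.0
    for spot in (80.0, 100.0, 120.0):
        print(f"spot {spot:>5}: call pays {call_payoff(spot, strike):>5}, "
              f"put pays {put_payoff(spot, strike):>5}")

Note how the put pays out precisely when the asset’s value falls, which is why, as described below, an ordinary insurance policy is in effect a “put option”.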

You name it: there are now paper promises for paper promises of every conceivable kind. Now the thing is that because you don’t need to own the “underlying asset” itself, there is no limit to the amounts of these paper promises that can be traded. Not that this is as novel as it may first appear.

Anyone who’s ever bought a lottery ticket has in effect speculated on a derivative, its value in this case being entirely dependent upon the random motion of coloured balls in a large transparent tumbler at an allocated future time. All betting works this way, and so all bets are familiar forms of derivatives. And then there are, if you like, negative bets. Bets you’d rather lose. For instance, “£200 says my house will burn down this year” is presumably a bet you’d rather lose, but it is still a bet that many of us annually make with an insurance company. And general insurance policies are indeed another familiar form of derivative – they are in effect “put options”.

However, there is one extremely important difference between an ordinary insurance policy and a “put option” – in the case of the “put option”, you don’t actually need to own the “underlying asset”, which means, to draw an obvious comparison, you might take out house insurance on your neighbour’s property rather than your own. And if their house burns down, ahem, accidentally, of course, then good for you. Cash in your paper promise and buy a few more – who knows, perhaps your neighbour is also a terrible driver. There are almost numberless opportunities for insuring other people’s assets, and with only the law preventing you, then why not change the law? Which is exactly what has happened, with some kinds of derivatives circumventing the law in precisely this way, and permitting profitable speculation on the basis of third-party failures. When it comes to derivatives then, someone can always be making a profit, come rain or shine, come boom or total financial meltdown.

But why stop there? Especially when the next step is so obvious that it almost seems inevitable. Yes, why not trade in speculations on the future value of the derivatives themselves? After all, treating the derivative itself as an “underlying asset” opens the way for multiple higher-order derivatives, creating with it the opportunity for still more financial “products” to be traded. Sure, these “exotic financial instruments” quickly become so complex and convoluted that you literally need a degree in mathematics in order to begin to decipher them. Indeed, those on the inside make use of what are called “the Greeks”, and “the higher-order Greeks”, since valuation requires the application of complex mathematical formulas comprised of strings of Greek letters – the traders here fully aware, of course, that it’s all Greek to the rest of us. Never mind – ever more financial “products” means ever more trade, and that’s to the benefit of all, right…?
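
For the curious, the “Greeks” are nothing more mysterious than sensitivities: rates of change of a derivative’s value with respect to its inputs. A minimal sketch, reusing the toy call payoff above in place of a proper pricing model (a simplification of mine, not anything from the text):

    # Delta, the first "Greek", estimated by a central finite difference:
    # nudge the underlying price S and see how much the option value moves.
    # The payoff-at-expiry stands in for a real pricing model here.
    def call_payoff(spot: float, strike: float) -> float:
        """Value of a call at expiry: max(S - K, 0)."""
        return max(spot - strike, 0.0)

    def delta(spot: float, strike: float, nudge: float = 0.01) -> float:
        """Approximate dV/dS by central finite difference."""
        up = call_payoff(spot + nudge, strike)
        down = call_payoff(spot - nudge, strike)
        return (up - down) / (2 * nudge)

    for spot in (80.0, 100.0, 120.0):
        print(f"spot {spot}: delta ~ {delta(spot, 100.0):.2f}")
    # Prints 0.00, 0.50 and 1.00: worthless options ignore the market,
    # deep in-the-money options track it one for one.

Gamma is the derivative of delta, theta the sensitivity to time, vega the sensitivity to volatility; the “higher-order Greeks” are simply further derivatives of the same kind, which is how the valuation formulas come to read like alphabet soup.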

Deregulation of the markets – kicked off in Britain by the Thatcher government’s so-called “Big Bang” and simultaneously across the Atlantic through the laissez-faire of “Reaganomics”6 – both enabled and encouraged this giddying maelstrom, allowing in the process the banking and insurance firms, the stockbrokerages and hedge funds that make up today’s “finance industry” to become the single most important “wealth creator” in the Anglo-American world. Meanwhile, declines in manufacturing output in Britain and America meant both nations were becoming increasingly dependent on sustained growth in the financial sector – with “derivatives” satisfying that requirement for growth by virtue of their seemingly unbounded potential. Indeed, having risen to become by far the largest business sector simply in terms of profit-making, many of the largest banks and insurance groups had become “too big to fail”7, failure leading potentially to national, if not international, economic ruin. Which is how the very systems that were supposedly designed to protect us, systems of insurance, have, whether by accident or design, left us more vulnerable than ever.

And then came the bombshell, as we learnt that the banks themselves were becoming bankrupt, having gambled their investments in the frenzy of deregulated speculation. Turns out that some of the money-men didn’t fully understand the complexity of their own systems; a few admitting with hindsight that they’d little more knowledge of what they were buying into than the rest of us. They’d “invested” because their competitors “invested”, and, given the ever-growing buoyancy of the markets at the time, not following suit would have left them at a competitive disadvantage. A desperate but strangely appropriate response to the demands of free market capitalism gone wild.

*

It is currently estimated that somewhere in the order of a quadrillion US dollars (yes, that’s with a qu-) has been staked on derivatives of various kinds. Believe it or not, the precise figure is actually uncertain because many deals are brokered in private. In the jargon of the trade these are called “over the counter” derivatives, which is an odd choice of jargon when the only things the average customer buys over the counter are drugs. Could it be that they’re unconsciously trying to tell us something again?

So just how big is one quadrillion dollars? Well, let’s begin with quadrillion. A quadrillion means a thousand trillion. Written at length, it is a one followed by a string of fifteen zeros. A number so humungous that it’s humanly impossible to properly comprehend: all comparisons fail. I read somewhere that if you took a quadrillion pound coins and put them side by side then they would stretch further than the edge of the solar system. The Voyager space programme was, of course, a much cheaper alternative. Or how about this: counting a number every second, it would take 32 million years to count up to a quadrillion… Now obviously that’s simply impossible – I mean, just try saying “nine hundred and ninety-nine trillion, nine hundred and ninety-nine billion, nine hundred and ninety-nine million, nine hundred and ninety-nine thousand, nine hundred and ninety-nine” in the space of one second! You see, it really doesn’t help to try to imagine any number as big as a quadrillion.
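
As it happens, both comparisons survive a quick back-of-the-envelope check. A minimal sketch, assuming the old 22.5 mm round pound coin and taking the solar system’s ‘edge’ as the heliopause at roughly 120 astronomical units (both assumptions mine):

    # Sanity-checking the quadrillion comparisons above.
    QUADRILLION = 10 ** 15

    # Counting one number per second:
    seconds_per_year = 60 * 60 * 24 * 365.25
    years = QUADRILLION / seconds_per_year
    print(f"{years / 1e6:.1f} million years")  # ~31.7 million years

    # A quadrillion pound coins (22.5 mm each) laid side by side:
    line_metres = QUADRILLION * 0.0225
    au_metres = 1.496e11  # one astronomical unit in metres
    print(f"{line_metres / au_metres:.0f} AU")  # ~150 AU, beyond the heliopause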

However, there are still useful ways to compare a quadrillion dollars. For instance, we can compare it against the entire world GDP, which turns out to be a mere 60 trillion US dollars8 – one quadrillion being roughly seventeen times larger. Or we might compare it against the estimated monetary wealth of the whole world: about $75 trillion in real estate, and a further $100 trillion in world stocks and bonds. So one quadrillion is a number far exceeding even the total monetary value of the entire world – material and immaterial! A little freaky to say the least! Especially when we discover that many of these derivatives are now considered to be “toxic assets”, which is a characteristically misleading way of saying they are worth nothing – yes, worthless assets! – whatever the hell that means!

So just like the Sorcerer’s Apprentice, it seems that the spell has gone out of control, and instead of these mysterious engines making new money out of old money, the system has created instead an enormous black hole of debt. A debt that we, the people, are now in the process of bailing out, with extremely painful consequences. Efforts to save us from a greater catastrophe having already forced the British and US governments to pump multiple hundreds of billions of public money into the coffers of the private banks. Yet the banks and the economy remain broken of course, because how is any debt larger than the monetary value of the entire world ever to be repaid?

Another tactic to halt descent into a full-blown economic meltdown has involved the issuance of additional fiat currency in both Britain and America; a “quantitative easing” designed to increase the supply of money by simply conjuring it up (a trick that fiat currency happily permits). Money may not grow on trees but it can most certainly be produced out of thin air. But here’s the rub. For in accordance with the most basic tenets of economic theory, whenever extra banknotes are introduced into circulation, the currency is correspondingly devalued. So you may be able to conjure money from thin air, but all economists will readily agree that you cannot conjure “real value”, meaning real purchasing power. Indeed this common mistake of confusing “nominal value” (i.e., the number of pounds written on the banknote) with “real value”, is actually given a name by economists. They call it: “the money illusion”. And it’s useful to remind ourselves again that money has only relative value.

To understand this, we might again consider money to be a commodity (which in part it is, traded on the currency markets). As such, and as with all other commodities, relative scarcity or abundance will alter its market value, and, in obedience to the law of supply and demand, more will automatically mean less. This is just as true for the value of money as it is for tins of herring, plumbers, scotch eggs and diamonds. So it seems that if too much of our quantitative is eased, then we’d better be prepared for a drastic rise in inflation, or much worse again, for hyperinflation. Printing too much money is how hyperinflation has always been caused.
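
One crude way to picture why more automatically means less is the textbook exchange identity MV = PQ (money supply times velocity equals price level times output); the identity and the invented numbers below are my illustration, not the author's:

    # Toy quantity-of-money illustration: hold velocity V and output Q
    # fixed, and watch the price level P rise with the money supply M.
    velocity, output = 2.0, 1_000.0

    for money_supply in (500.0, 750.0, 1_000.0):
        price_level = money_supply * velocity / output
        print(f"Money supply {money_supply:>7,.0f} -> price level {price_level:.2f}")

    # Doubling M from 500 to 1,000 doubles P: twice the banknotes chasing
    # the same goods leaves each note with half the purchasing power.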

Our future is bleak, they tell us. Our future is in the red. So much for security, so much for insurance. We’d apparently forgotten to beware of “the Greeks” and of the “higher order Greeks” when they’d first proffered gifts.

*

I said earlier, just in passing, that money is actually pretty boring stuff, and it is… truly, madly and deeply boring! So when I hear on the news how “the markets” are hoping that the latest round of “quantitative easing” will enable governments to provide the necessary “fiscal stimulus”, I am barely even titillated. Whilst explanations, both in the popular press and the supposedly more serious media, that liken such injections of new money to filling up my car with imaginary petrol, provide me only with a far, far more entertaining distraction: to wit, a magical car that runs on air.

But then, of course, money isn’t really stuff at all! More properly considered, money is perhaps a sort of proto-derivative, since its worth is evidently dependent upon something other than the paper it’s (increasingly not) written on. So what is it that money’s worth depends upon? What underlies money? Well, the accepted answer to this question is apparently that money is a “store of value”. Although this leads immediately to the obvious follow-up question: in this context, what precisely is the meaning of “value”? But here again there is a problem, since “value”, although a keystone of economic thinking, has remained something of an enigma, economists being unable to agree upon any single definitive meaning.

Is “value” determined by a thing’s usefulness? Or is it generated by the amount of effort required in its production? Or is there some other kind of innate economic worth – for instance, in a thing’s scarcity? And can this worth be attributed at the individual level, or only socially imputed?

There are a wide variety of definitions and explanations of “value” which, being so foundational, have encouraged the various branches of economic theory to diverge. And here is another important reason why economics is in no way equivalent to the physical sciences. Ask any physicist what energy is, and they will provide both an unambiguous definition and, no less importantly, offer established methods for measurement. Because of this, if ever one physicist talks to another physicist about energy (or any other physical quantity) they can be absolutely certain that they are talking about the same thing. Which is most certainly not the case when economists talk about “value”.

“A cynic is a man who knows the price of everything and the value of nothing,” said Oscar Wilde, distinguishing with playful wisdom the difference in human terms between “price” and “value”. The great pity is that the overwhelming majority of today’s economists have become so cynical – but then perhaps they always were.

*

As part of his ongoing assault against religion, Richard Dawkins recently published a book called The God Delusion. It’s the old hobby-horse again; one that he shares with a great many millions of other broadly liberal, literate and intelligent people. That religion is an evil of which humanity must rid itself totally. And yes, much of religion has been dumb and dangerous, this I will very readily concede (and already have conceded in earlier chapters). But really and truly, is it “the God delusion” that we should be most concerned about in these torrid times? For regardless of Dawkins’ claims, it is quite evident that religion is a wounded animal, and for good or ill, the secular world is most certainly in the ascendant. Right throughout the world, aside from a few retreating pockets of resistance, faith in the old gods has been gravely shaken. It is not that human faith, by which I mean merely a belief in and/or worship of something greater, is extinguished, for it never can be, but that it has been reattached to new idol-ologies. And in those parts of the world where the old religions have been most effectively disarmed or expelled, namely the West, one idol-ology above all others has gathered strength from Religion’s demise.

Richard Dawkins has said many times that instructing young children in religious obedience is a form of psychological child abuse, and on this point I wholeheartedly support him. Children’s minds are naturally pliable for very sound developmental reasons. But is it any less pernicious to fill their precious minds with boundless affection for, let’s say, Ronald McDonald? For this is merely one stark but obvious illustration of how a new fundamentalism has been inculcated in the young. Devotion to the brand. Love of corporations. Worship of the dollar and the pound.

This new kind of fundamentalism has long since swept across the world, but it is unusual, although not unique, in that it denies its own inherent religiosity whilst claiming to have no idols. This is the fundamentalism of free market neoliberal economics. The Father, Son and Holy Ghost have been forsaken, usurped by the IMF, the World Bank and the WTO. If you think I’m joking, or that this is mere hyperbole, then think again. When things are tough we no longer turn to the heavens, but instead ask what sacrifices can be made to “reassure the markets”. Sacrifices to make it rain money again.

Here, above all, is the most pernicious delusion of our age. And it has next to nothing to do with God, or Yahweh, or Allah, or even the Buddha. The prophets of our times talk of nothing besides profits or losses. They turn their eyes to the Dow Jones Index, trusting not in God, but only in money. So I call for Dawkins to leave aside his God delusion, for a moment, and pay a little attention to the rise and rise of “the money delusion”. When future historians reflect on our times, this is what they will see, and given the mess this “money delusion” is creating they will scratch their heads in disbelief and disgust.

*

I have already discussed the so-called “money illusion” – of mistaking nominal banknote value for real purchasing value – but this is merely one of many nested and interrelated illusions that make up “the money delusion”. Illusions that have become so ingrained within our permitted economic thinking that they are completely taken for granted.

Foundational is the belief that individuals always make rational choices. By definition, rational choice requires that we all choose with consistency, and always with the aim of securing more over less. That a huge advertising industry now exists to tempt us into irrationality is never factored in. Nor are the other corrosive influences that so obviously deflect our rational intentions: the coercion of peer pressure, our widespread obsession with celebrities and celebrity endorsement, and the never-ending pseudo-scientific babble that fills up many of the remaining column inches and broadcast hours of our commercial media. We are always eager for the latest fashionable fads, and perhaps we always were. Yet this glaring fact, that people make wholly irrational choices time and again, whether through innate human irrationality or by deliberate design, is of little concern to most economists. It is overlooked and omitted.

Likewise, a shared opinion has arisen under the name of neoliberalism that economics can itself be neutral, usefully shaping the world without the nuisance of having to rely on value judgements or needing any broader social agenda. If only individuals were left to make rational choices, as of course they do by definition, or so the idea goes, and the market could also be unshackled, then at last the people would be free to choose. Thus, goes the claim, individual freedom can only be guaranteed by having freedom within the marketplace. Freedom trickling down with the money it brings. “Wealth creation” alone must solve our problems, by virtue of its being an unmitigated good.

Of course, back in the real world, one man’s timber very often involves the destruction of another man’s forest. Making profits from the sale of drugs, tobacco and alcohol has social consequences. Factories pollute. Wealth creation has its costs, which are very often hidden. There is, in other words, and more often than not, some direct negative impact on a third party, known to economists as “spillover” or “externalities”, that is difficult to quantify. Or we might say that “wealth creation” for some is rather likely therefore to lead to “illth creation” for others.

Illth creation? This was the term coined by the Romantic artist, critic and social reformer John Ruskin, and first used in his influential critique of nineteenth-century capitalism, Unto This Last. Ruskin had presumably never heard of “the trickle-down effect”:

“The whole question, therefore, respecting not only the advantage, but even the quantity, of national wealth, resolves itself finally into one of abstract justice. It is impossible to conclude, of any given mass of acquired wealth, merely by the fact of its existence, whether it signifies good or evil to the nation in the midst of which it exists. Its real value depends on the moral sign attached to it, just as sternly as that of a mathematical quantity depends on the algebraical sign attached to it. Any given accumulation of commercial wealth may be indicative, on the one hand, of faithful industries, progressive energies, and productive ingenuities: or, on the other, it may be indicative of mortal luxury, merciless tyranny, ruinous chicane.”9

*

We are in the habit of regarding all money as equal: presuming that the pounds and pence which make up my own meagre savings are equivalent, in some directly proportional manner, to the billions owned by, let’s say, George Soros. A cursory consideration shows this to be laughable.

For instance, we might recall that on “Black Wednesday” in 1992, Soros single-handedly shook the British economy (although the then-Chancellor of the Exchequer Norman Lamont was left to shoulder the blame)10. But to illustrate this point a little further, let me tell you about my own small venture into the property market.

Lucky enough to have been bequeathed a tidy though hardly vast fortune, I recently decided to purchase a house to live in. The amount, although not inconsiderable by everyday standards (compared, say, with the income and savings of Mr and Mrs Average), and very gratefully received, was barely sufficient to cover local house prices, except that I had one enormous advantage: I had cash, and cash is king.

For reasons of convenience, cash is worth significantly more than nominally equivalent amounts of borrowed money. In this instance I can estimate that it was probably worth a further 20–30%. Enough to buy a far nicer house than if I’d needed to see my bank manager. A bird in the hand…

Having more money also has other advantages. One very obvious example being that it enables bulk purchases, which being cheaper, again inflates its relative value. The rule in fact is perfectly straightforward: when it comes to money, more is always more, and in sufficient quantities, it is much, much more than that.

But then, of course, we have the market itself. The market that is supposedly free and thus equal. The reality being, however, that since money accumulates by virtue of attracting its own likeness, the leading players in the market, whether wealthy individuals or giant corporations, by wielding larger capital resources, can operate with an unassailable competitive advantage. These financial giants can and do stack the odds even higher in their favour by more indirect means, such as buying political influence with donations to campaign funds and by other insidious means such as lobbying – all of which is simply legally permitted bribery. The flaunted notion of a free market is therefore the biggest nonsense of all. There is no such thing as a free market: never has been and never will be.

The most ardent supporters of free market neoliberalism say that it is a non-normative system, which permits us finally to rid ourselves of disagreements over pesky value judgements. The truth, however, is very much simpler. By ignoring values, it becomes a system devoid of all moral underpinning. Being morally bankrupt, it is unscrupulous in the truest sense of the word.

*

If I had enough money and a whim, I might choose to buy all the plumbers and tins of herrings in Britain. Then, since money is (in part) a measure of scarcity, I could sell them back later with a sizeable mark-up. Too far-fetched? Well, perhaps, but only in my choice of commodity. The market in other commodities has without any question been cornered many times in the past. For instance, by the end of the 1970s, two brothers, Nelson Bunker and William Herbert Hunt, had accumulated and held what was then estimated to be one third of all the world’s silver. This led to serious problems both for high-street jewellers11 and for the economy more generally12, and as it happened, when the bubble burst on what became known as “Silver Thursday”, it also spelt trouble for the brothers’ own fortune. Fortunately for them, however, the situation was considered so serious that a consortium of banks came forward to help bail them out13. They had lost; their fortune was diminished, although by no means wiped out. As relatively small players they’d played too rough; meanwhile much larger players ensure that the markets are routinely rigged through just such manufacture of scarcity. As early as 1860, John Ruskin had already pointed out a different but closely-related deficiency in any market-driven capitalist system of trade:

“Take another example, more consistent with the ordinary course of affairs of trade. Suppose that three men, instead of two, formed the little isolated republic, and found themselves obliged to separate, in order to farm different pieces of land at some distance from each other along the coast: each estate furnishing a distinct kind of produce, and each more or less in need of the material raised on the other. Suppose that the third man, in order to save the time of all three, undertakes simply to superintend the transference of commodities from one farm to the other; on condition of receiving some sufficiently remunerative share of every parcel of goods conveyed, or of some other parcel received in exchange for it.

“If this carrier or messenger always brings to each estate, from the other, what is chiefly wanted, at the right time, the operations of the two farmers will go on prosperously, and the largest possible result in produce, or wealth, will be attained by the little community. But suppose no intercourse between the landowners is possible, except through the travelling agent; and that, after a time, this agent, watching the course of each man’s agriculture, keeps back the articles with which he has been entrusted until there comes a period of extreme necessity for them, on one side or other, and then exacts in exchange for them all that the distressed farmer can spare of other kinds of produce: it is easy to see that by ingeniously watching his opportunities, he might possess himself regularly of the greater part of the superfluous produce of the two estates, and at last, in some year of severest trial or scarcity, purchase both for himself and maintain the former proprietors thenceforward as his labourers or servants.”14

By restricting the choices of others, one’s power over them is increased, and it is this that brings us to the real reason why money becomes such an addiction, especially for those who already have more than they know what to do with. For truly the absolute bottom line is this: that money and power become almost inseparable unless somehow a separation can be enforced. And whilst wealth, especially when excessive, accumulates, as it almost invariably does, along with it goes the accumulation of power. This underlying and centralising mechanism has perhaps always operated at the heart of all civilisation. But even the power of money has its limits, as Ruskin points out:

“It has been shown that the chief value and virtue of money consists in its having power over human beings; that, without this power, large material possessions are useless, and to any person possessing such power, comparatively unnecessary. But power over human beings is attainable by other means than by money. As I said a few pages back, the money power is always imperfect and doubtful; there are many things which cannot be reached with it, others which cannot be retained by it. Many joys may be given to men which cannot be bought for gold, and many fidelities found in them which cannot be rewarded with it.

“Trite enough, – the reader thinks. Yes: but it is not so trite, – I wish it were, – that in this moral power, quite inscrutable and immeasurable though it be, there is a monetary value just as real as that represented by more ponderous currencies. A man’s hand may be full of invisible gold, and the wave of it, or the grasp, shall do more than another’s with a shower of bullion. This invisible gold, also, does not necessarily diminish in spending. Political economists will do well some day to take heed of it, though they cannot take measure.”15

Until such a time, every action and probable outcome must continue to be evaluated on the basis of strict cost and benefit estimates. Our “ponderous currencies” literally enable a figure to be set against each human life – an application fraught with the most serious moral dilemmas and objections – and beyond even this, we have price tags for protecting (or else ruining) the natural environment all our lives depend upon. For only the market can secure our futures, optimally delivering us from evil, though inevitably it moves in mysterious ways. This is how the whole world – land, water, air and every living organism – came to be priced and costed: everything set against a notional scale that judges exclusively in terms of usefulness and availability. Such is the madness of our money delusion.

We are reaching a crisis point. A thoroughgoing reappraisal of our financial systems, our economic orthodoxies, and our attitudes to money per se is desperately required. Our survival as a species may depend on it. Money ought to be our useful servant, but instead remains, at least for the vast majority, a terrible master. As a consequence, our real wealth has been too long overlooked. Time then for this genie called money to be forced back tight inside its bottle. Ceaselessly chasing its golden behind, and mistaking its tight fist for the judicious hand of God, is leading us ever further down the garden path. Further and further away from the land it promises.

Next chapter…

*

Addendum: Q & A

Back in April 2012, I forwarded a draft of this chapter to friends in Spain (a nation already suffering under imposed “austerity measures”). They sent an extended reply which raised two interesting and important questions. Both questions along with my replies are offered below:

Q1: You seem to be saying that printing money (as the US and UK, who are in control of their own currencies, are doing) is as bad as dealing with the debt problem by means of austerity (the “Merkozy” approach). But the latter is surely worse.

A. I think these are simply two sides of the same scam. The bankers create an enormous unpayable debt and then get governments to create new money to bail them out. This is sold to us as a way of bailing out a few chosen victims (Greece, Spain, Portugal, Ireland) although it simply means a huge transfer of wealth from public into private hands. To make that money useful to the bankers (and the rest of the ruling elite) ‘austerity measures’ are put in place which not only steal money off the average person but also permit the fire sale of national assets. Meanwhile, in Britain and America, the governments are helping to pay for these bailouts by creating money out of thin air, which means the real value of our money is reduced through inflation (effectively a hidden tax). If the money were invested in infrastructure or education or whatever, then this could potentially be a good thing (even though it still creates inflation), so certainly QE could have been beneficial but not when you use the money only to keep afloat a huge Ponzi scheme. But then you ask later…

Q2: ‘but how come the pound is high now and the euro low’

A. That’s a very good question and I won’t pretend that I understand this completely, but I gather there are plenty of ways for keeping currencies higher than they ought to be by manipulating the markets [incidentally, the Forex Scandal to manipulate and rig the daily foreign exchange rates did not come to light until Summer 2013]. The market is rigged in any case by virtue of the fact that the dollar remains the world’s reserve currency and that oil is traded entirely in dollars. But essentially what’s going on here is a huge currency war, and the euro is constantly under attack from speculators. I am fairly certain that the chickens will come home to roost sooner or later in America and Britain (and in Germany too), but meanwhile the governments simply go about cooking the books and telling us how inflation is only 4% or whatever when fuel prices, for instance, have rocketed during the past few years. In any case, we get ‘austerity’ too, not as hardline yet as the ‘austerity’ being imposed elsewhere, but it will come – of this I have no doubt. Either it will happen slowly, or worse, there will be a huge war and the ‘austerity’ will be brought into place to justify the expense of that. This is a deliberate attack by the bankers against the people of the world, and until the people of the world say that’s enough, and most of the debts are cancelled outright, I don’t see any way this can be reversed.

*

Another topic I briefly touched upon in the chapter above is the matter of inflation. What is it and what causes it? My answers were sketchy, in part, because I wished to avoid getting too bogged down in technicalities beyond my training. But this question about the causes of inflation is, in any case, an extremely thorny one. Different schools of economists provide different explanations.

One less orthodox account that I have frequently come across is that our fractional reserve banking system, when combined with a central bank’s issuance of a fiat currency, is inherently inflationary. That in the long term, and solely because of these extant monetary mechanisms, inflation is baked into the cake. So I wrote to a friend who holds with the above opinion and asked if he would explain, “in the briefest terms that are sufficient”, why he and others believe that central bank issuance of currency and fractional reserve banking are the primary underlying cause of inflation. Here is his succinct but detailed reply:

In a central bank system, money is created in the first instance by governments issuing bonds to banks and banks “printing” money and handing it over to the government in return. The government then owes the banks the money plus interest. If they ever pay back any of the principal, then a corresponding amount of bonds are handed back, i.e. cancelled. In that case, the money repaid goes out of existence!

Before elaborating any further, let’s take a step back. Fractional reserve lending doesn’t require central banks, nor does it require governments to create money by issuing bonds in exchange for it. Fractional reserve lending is simply the act of taking someone’s money to “look after it”, then turning around and lending a fraction of it to someone else. If the lender has enough depositors, then the sum of all the unlent fractions of each deposit should cover him if one of them suddenly comes through the door asking for all their money back in one go. As I’m sure you know, if too many turn up at once looking for their money, a run ensues. Fractional reserve banking doesn’t even require a government-sanctioned paper currency to exist. Depositors can simply deposit something like gold and the lenders can issue receipts which become the paper currency.

In olden times, when depositors of gold first found out that the goldsmiths they were paying to store their gold safely were lending it out for a percentage fee, they were outraged. The goldsmiths appeased them by offering them a cut of the fee for their interest in the scam. Accordingly, this money became known as ‘interest’.

So where do central banks fit in? Countries like the United States prior to 1913 have operated without central banks. There were thousands of banks of all sizes. To compete with one another, they had to endeavour to offer higher interest to depositors, lower interest rates to borrowers, or to cut the fraction of deposits that they kept in reserve. This latter aspect was what occasionally caused banks to go to the wall, to the detriment of their depositors.

Central banking avoids this risk because the same fractional reserve ratio applies to all the banks under a central bank’s jurisdiction. However, it is really a way to avoid competition and if the system ever does get into trouble, the government feel obliged to bail it out or risk collapse of the whole system.

Now to answer your question about inflation.

In a fractional reserve central bank system, money is created as I’ve described: by the government issuing bonds to the bank, receiving money created out of thin air and having to pay interest on it. When they spend it by paying the salaries of government employees, contractors, arms manufacturers and so on, that money goes straight into bank accounts and the bankers can’t wait to lend out as much of it as possible, up to the limit of whatever fractional reserve ratio applies. So now there is a double claim on the money. The government employee thinks their salary is sitting in the bank, but 90 percent of it is in the pocket of a borrower who thinks it’s theirs as long as they keep up with interest. That borrower will inevitably either put the borrowed sum in their own bank account or spend it. Either way it will end up in another bank account somewhere. Then the same thing happens again; up to 90 percent of it gets lent out (81 percent of the original government-created money) and so on…
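
[As an editorial aside: the arithmetic of this re-lending chain is easy to check. Below is a minimal sketch in Python, using the 10 percent reserve ratio assumed above; the function name and figures are mine, purely for illustration. Each round, banks lend on 90 percent of whatever was just deposited, and the running total of deposits converges on the initial sum divided by the reserve ratio – the textbook “money multiplier”.]

```python
# Minimal sketch of fractional reserve re-lending (illustrative only:
# real banking involves many banks, cash leakages and lending limits).

def total_deposits(initial_deposit: float, reserve_ratio: float, rounds: int) -> float:
    """Sum the chain of deposits: each round, (1 - reserve_ratio) of the
    previous deposit is lent out and re-deposited somewhere else."""
    total, deposit = 0.0, initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= 1 - reserve_ratio  # 90% lent on, ending up in another account
    return total

print(total_deposits(100, 0.10, 3))     # ≈ 271  (100 + 90 + 81, as described)
print(total_deposits(100, 0.10, 1000))  # ≈ 1000 = 100 / 0.10, the limiting case
```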

We end up in a situation where all of the money in circulation has arisen from someone, somewhere, signing on the dotted line to put themselves in debt. The money isn’t backed by a commodity such as gold. Instead it is backed by the ability of the borrowers to repay. All these borrowers, including the government, are paying interest. If interest is to be paid on every penny in circulation, then it doesn’t take a genius to figure out that new money must be continuously ‘created’ to keep paying it. That occurs by governments constantly borrowing, so that their debts keep on increasing, and by borrowers constantly borrowing more and more. This seems to work as long as prices, wages and asset values keep increasing. Generation after generation, workers can afford to pay more and more for the houses that they live in, because the price of the house keeps going up, so it looks like good collateral to the lender, and their wages keep going up too, so the borrower can meet the payments in the eyes of the lender.

Working out what the rate of inflation is at any given time is practically impossible. Government figures such as RPI and CPI are just another tool for the propagandists to use as they see fit at any given time. However, for the banks to gain anything from the game, the rate of inflation must be:

  • less than the rate of interest paid by borrowers; and
  • greater than the rate of interest paid to savers.

This is why savers’ money is ‘eroded’ if they just leave it sitting in a bank account.
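
[Another editorial aside: a quick illustration of those two conditions, with rates that are entirely hypothetical – only their ordering matters.]

```python
# Hypothetical rates chosen only to satisfy the two conditions above:
# saver rate < inflation < borrower rate.
inflation, saver_rate, borrower_rate = 0.04, 0.02, 0.07

assert saver_rate < inflation < borrower_rate  # the bank's winning position

# A saver's 100 pounds after one year, restated in today's purchasing power:
real_value = 100 * (1 + saver_rate) / (1 + inflation)
print(round(real_value, 2))  # ≈ 98.08 – nominally up, quietly eroded
```
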
Now imagine a different system where:

  • governments issue paper money by printing it themselves;
  • the amount in circulation is absolutely fixed;
  • there is no central bank but there are plenty of independent banks.

In such a country, there is no need for the government to have any debt and there is ample historical evidence of nations that have existed without government debt for very long stretches of time. What borrowers there are have to find the interest by earning it from the fixed pool of currency that is in circulation. There is little need for anyone to borrow but that’s something that most people you speak to have difficulty accepting. That’s because they’ve only ever lived in a system where they spend their lives in the service of debt and cannot conceive of it being any different.

The bankers right at the top of the system aren’t out to grab hold of all the money in the world. They’re not after all the tangible wealth in the world either. Their only goal is to ensure that as much human labour as possible is in the service of debt.

Now for something different. How can this whole thing go horribly wrong for the bankers? I don’t just mean a run on banks or a recession. That happens periodically and is known as the business cycle. People lose confidence and are reluctant to borrow for a number of years, then they regain confidence and start to borrow again and the whole thing picks up and the cycle repeats.

What can go horribly wrong is if, after generations and generations and generations of increasing prices and debts, everyone gets more spooked by debt than ever before and totally fixated on repaying it. They sell assets but there are so many folk doing that that asset prices start to decline. That spooks people further. A spiral is under way. Banks try to ‘stimulate’ the economy by lowering interest rates but there is very little confidence around, especially if asset prices are declining compared with debts and wages aren’t rising either (or may be in decline), so that the ability to repay debt is impaired. This decline can be long and protracted. Also there can be many ups and downs along the way, although the long term trend is down. Ups can be deceptive as they are perceived as “coming out of the recession” by those used to the normal business cycles we’ve experienced throughout the whole of the twentieth century. In this way, asset prices can bleed away until eventually they reach something like a tenth of their peak value. This process can reach a very late stage before a lot of people recognise what’s really going on. This is just a scenario but one worth considering seriously. We could be in for long term deflation but it will be well under way and too late for many people in debt by the time it gets mainstream acknowledgement.

A closely-related question and one that automatically follows is why do countries bother having central banks at all? Instead of a government issuing bonds, why not directly issue the currency instead, thereby cutting out the middle men? It is an approach that actually has a number of historical precedents as pointed out in this open letter to Obama urging him to reissue ‘greenbacks’ and the campaign in Britain to print ‘treasury notes’ like the Bradbury Pound. So in a further reply to my friend I asked him, “do you think that the re-issuance of ‘greenbacks’ in America or the Bradbury Pound in the UK might offer a realistic solution to the current crisis?” His response:

The issue of greenbacks or whatever you call them (essentially government-issued money) would probably make no immediate difference. Already, the money created by quantitative easing is not working its way into the system, so why would money issued by any other means?

In the longer term, such a fundamental upheaval would make a huge difference, as the government wouldn’t need to be in debt the whole time and people wouldn’t have to keep paying increasing prices for houses and cars on top of interest. Pensioners wouldn’t be on a treadmill, having to ‘invest’ their savings in a vain effort just to keep up with inflation.

There’s a risk that the government might be tempted to print more and more money, which is often cited as a point in favour of the present system. It is claimed that having to pay interest and ultimately repay the whole principal is a disincentive in this respect. However, the current system ensures constant “printing” all the time as there’s no way that everyone involved can pay interest otherwise.

There’s talk at the moment about banks charging people a few percent for holding their money on deposit, i.e. “negative interest”. People think they’ll lose money as their account balances will go down over time. However, it’s no different to being paid, say, six percent interest at a time when inflation is at 9 percent and the cheapest loan you can get is 12 percent.
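
[Editorial aside: to put numbers on this comparison, the real (inflation-adjusted) return can be computed with the exact Fisher relation. The assumption of near-zero inflation in the negative-interest case is mine; the 6 and 9 percent figures are the ones quoted above.]

```python
def real_return(nominal: float, inflation: float) -> float:
    """One-year real (inflation-adjusted) return, as a fraction."""
    return (1 + nominal) / (1 + inflation) - 1

# Charged 3% to hold money on deposit, with inflation assumed near zero:
print(f"{real_return(-0.03, 0.00):+.2%}")  # -3.00%

# 'Paid' 6% interest while inflation runs at 9%:
print(f"{real_return(0.06, 0.09):+.2%}")   # about -2.75%
```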

I’m amazed at how people in the alternative media can inform us that banks are going to charge us ‘negative interest’ for our deposits, express outrage and then in the next breath claim that we’re in a hyperinflationary environment. Low/negative interest is a sure sign of massive deflationary pressure. I don’t know what’s going to happen but I’m convinced that deflation’s the one to watch. It has the potential to catch people out.

Getting back to your original question, the direct issuing of money by the government would represent a seismic shift of power from bankers to governments; a shift in the right direction, no doubt. It’s only possible if everyone knows exactly what’s going on. We’re a very long way off yet. People’s understanding of the banking scam is very, very poor.

I would add that very much front and centre in that scam is the role of the central banks: extraordinarily powerful commercial bodies that adopt the outward appearance of public institutions when in fact they work for commercial interests. The US Federal Reserve, for instance, is a de facto private corporation and all of its shareholders are private banks. The status of the Bank of England is more complicated. This is what the main Wikipedia entry intriguingly has to tell us:

Established in 1694, it is the second oldest central bank in the world, after the Sveriges Riksbank, and the world’s 8th oldest bank. It was established to act as the English Government’s banker, and is still the banker for HM Government. The Bank was privately owned [clarification needed (Privately owned by whom? See talk page.)] from its foundation in 1694 until nationalised in 1946.[3][4] 

Original references retained.

Clarification needed indeed! Anyway, nowadays it is officially (since 1998) an ‘independent public organisation’. However, the BoE is not really as independent as it might first appear, since along with eighteen other central banks from around the world (including the US Federal Reserve) it is a member of the executive of “the central bank for central banks” – the little-known Bank for International Settlements (BIS) based in Basel, Switzerland. To hear more about the history, ownership and function of this highly profitable (tax-free and extraterritorial) organisation, I recommend listening to this interview with Adam LeBor, author of the recently released book Tower of Basel.

For my own more detailed thoughts on effective remedies to the on-going financial crisis please read this earlier post.

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.

*

1 From “The Future”, Essays in Persuasion (1931), Ch. 5, John Maynard Keynes, CW, IX, pp. 329–331, Economic Possibilities for our Grandchildren (1930).

2 Adam Smith applied “the law of diminishing utility” to solve “the paradox of water and diamonds”. Water is a vital resource and most precious to life and yet it is far less expensive to purchase than diamonds, comparatively useless shiny crystals, which in his own times would have been used solely for ornamentation or engraving. The reason, Smith decides, is that water is readily abundant, such that any loss or gain is of little concern to most people in most places. By contrast, the rarity of diamonds means that, although less useful overall, any loss or gain of use is more significant, or to put it more formally the “marginal utility” is greater.

3 Extract taken from The soul of man under socialism by Oscar Wilde (first published 1891).

4 Legal tender is a technical legal term that basically means an offer of payment that cannot be refused in settlement of a debt.

5 Fiat (Latin), “let it be done” meaning that these currencies are guaranteed by government decree only.

6 Milton Friedman pays homage to Ronald Reagan’s record on deregulation in an essay entitled “Freedom’s friend” published in the Wall Street Journal on June 11, 2004. Drawing evidence from The Federal Register, which “records the thousands of detailed rules and regulations that federal agencies churn out in the course of a year”, Friedman contrasts Reagan’s record with that of Presidential incumbents before and since: “They [the rules and regulations] are not laws and yet they have the effect of laws and like laws impose costs and restrain activities. Here too, the period before President Reagan was one of galloping socialism. The Reagan years were ones of retreating socialism, and the post-Reagan years, of creeping socialism.” For socialism read regulation. http://online.wsj.com/news/articles/SB108691016978034663

7 Definition of “too big to fail” taken from Businessdictionary.com: “Idea that certain businesses are so important to the nation, that it would be disastrous if they were allowed to fail. This term is often applied to some of the nation’s largest banks, because if these banks were to fail, it could cause serious problems for the economy. By declaring a company too big to fail, however, it means that the government might be tempted to step in if this company gets into a bad situation, either due to problems within the company or problems from outside the company. While government bailouts or intervention might help the company survive, some opponents think that this is counterproductive, and simply helping a company that maybe should be allowed to fail. This concept was integral to the financial crisis of the late 2000s.”

8 According to IMF economic database for October 2010, World GDP is $61,963.429 billion (US dollars).

9 Unto This Last is based on a collection of four essays first published in the monthly Cornhill Magazine, 1860, and then reprinted as Unto This Last in 1862. This extract is drawn from his second essay: “The Veins of Wealth”

10 George Soros proudly explains the events of “Black Wednesday” on his official website: “In 1992, with the economy of the United Kingdom in recession, Quantum Fund’s managers anticipated that British authorities would be forced to break from the European Exchange Rate Mechanism (ERM) then in force and allow the British pound to devalue in relation to other currencies, in particular the German mark. Quantum Fund sold short (betting on a decline in value) more than $10 billion worth of pounds sterling. On September 16, 1992—later dubbed “Black Wednesday”—the British government abandoned the ERM and the pound was devalued by twenty percent.” http://www.georgesoros.com/faqs/archive/category/finance/

11 “Last year [1979] Bunker and his syndicate began buying silver again, this time on a truly gargantuan scale. They were soon imitated by other speculators shaken by international crises and distrustful of paper money. It was this that sent the price of silver from $6 per oz. in early 1979 to $50 per oz. in January of this year. Chairman Walter Hoving of Tiffany & Co., the famous jewelry store, was incensed. Tiffany ran an ad in the New York Times last week asserting: ‘We think it is unconscionable for anyone to hoard several billion, yes billion, dollars worth of silver and thus drive the price up so high that others must pay artificially high prices for articles made of silver from baby spoons to tea sets, as well as photographic film and other products.’” Extract taken from “He Has a Passion for Silver”, article published in Time Magazine, Monday 7 April, 1980. http://content.time.com/time/magazine/article/0,9171,921964-2,00.html

12 “Many Government officials feared that if the Hunts were unable to meet all their debts, some Wall Street brokerage firms and some large banks might collapse.” Extract taken from “Bunker’s busted silver bubble”, article published in Time Magazine, Monday 12 May, 1980. http://content.time.com/time/magazine/article/0,9171,920875,00.html

13 “What may deal the Hunt fortune a fatal blow is the fallout from the brothers’ role in the great silver-price boom and bust of 1980. Thousands of investors who lost money in the debacle are suing the Hunts. On Saturday the brothers lost a civil case that could set an ominous precedent. A six-member federal jury in New York City found that the Hunts conspired to corner the silver market, and held them liable to pay $63 million in damages to Minpeco, a Peruvian mineral-marketing company that suffered heavy losses in the silver crash. Under federal antitrust law, the penalty is automatically tripled to $189 million, but after subtractions for previous settlements with Minpeco, the total value of the judgment against the Hunts is $134 million.” Extract taken from “Big bill for a bullion binge”, article published in Time Magazine, Monday 29 August, 1988. http://content.time.com/time/magazine/article/0,9171,968272-1,00.html

14 Extract also taken from the second essay, entitled: “The Veins of Wealth” of Unto This Last by John Ruskin.

15 Ibid.


welcome to the Panopticon: a potted history of mass surveillance

Two centuries ago:

In 1791, the Father of Utilitarianism and ardent social reformer Jeremy Bentham published blueprints for a wholly new design of prison. Called the Panopticon, from observe (-opticon) all (pan-), the design, which involved a ring of cells surrounding a central lodge, allowed the guards to keep an eye on all of the inmates, and importantly, without the inmates in turn being aware of when they were being watched.

Bentham had big plans for his design, suggesting that aspects of the concept might usefully be applied to the construction of hospitals, schools and workhouses.

One century ago:

H.G. Wells was the father of a good many utopias. He spent the greater part of his creative life planning the shape of future societies. One of his most complete visions is laid out in a novel entitled simply A Modern Utopia (published in 1905). The story goes that two travellers walking in the Swiss Alps suddenly discover themselves in a parallel world: a new world that is Earth (at least geographically and biologically speaking), but one where civilisation has been reconstructed on altogether more Wellsian principles.

The inhabitants of this world are guaranteed housing, food and basic essentials. Even the unemployed are provided with a minimum wage, this safety net granted as “workfare” rather than “welfare”, with its recipients being coerced into work for the greater good of the state. In this vision of Wellsian meritocracy, the total measure of individual status depends solely upon earned income: the citizens of the new society regarding being broke as “clear evidence of unworthiness”.

Meanwhile criminal types and drug-users are given very short shrift. Removed from the main body of society and placed on high security prison islands, they are also sexually segregated to ensure that such poor genetic stock can never again pollute the otherwise healthy gene-pool.

Central to this alternative civilisation, the two explorers learn, there is a world-government (Wells never can resist the idea) made possible by a monumental database, with information stored on a card-index system housed in Paris. And Wells says that “Such a record is inevitable if a Modern Utopia is to be achieved.” But of course, what Wells could not foretell was how quickly technology would render the card-index system obsolete and make the establishment of such a global database entirely achievable.

Half a century ago:

It was 1948 when George Orwell settled into seclusion on the Isle of Jura, and there began to work on his most lasting contribution to literature and language. A little over a year passed before his terrifying vision of a future dystopia would be published, entitled simply Nineteen Eighty-Four.

Nineteen Eighty-Four isn’t merely gloomy, it is hellish in altogether more Orwellian ways. A one-party state, in which every member of Ingsoc (the Party) lives under close and constant scrutiny, watched on two-way telescreens, which are highly sensitive devices that can never be turned off. Casual conversations are eavesdropped, by friends just as surely as by strangers, and children are actively encouraged to snoop on their parents; enrolling with the juvenile troops of Spies rather than Scouts (often to the delight and pride of their own brainwashed parents).

There is absolutely no place for privacy in Nineteen Eighty-Four, certainly not for anyone in the Party, with the telescreens monitoring indoors, whilst outside, and aside from the hidden microphones, it is safe to presume that everyone is probably an informant. The Party has, however, less concern for minor dissent that may flare up within the lower ranks of ‘the proles’; the masses that it regards as so ignorant and intent on self-preservation as to pose no serious counter-revolutionary threat. Although even amongst the proles there stalks the ever-present menace of the Thought Police.

Orwell’s new world of dread was forged from the same ideological foundations as the just defeated axis of Fascism. It was a world divided by class, hatred and perpetual war. A world riven and driven by Power. And undoubtedly Orwell was in part presenting his critique of the post-war Soviet Union reconstructed under that other great dictator, Joseph Stalin, with his all-new formula for Communism. Indeed, on the basis of Orwell’s images of Big Brother, it’s fair to judge that this all-powerful leader of Ingsoc (the single party governing the new alliance of Oceania1) was a caricature of Stalin.

Aldous Huxley was Orwell’s old teacher, and in his own futurist satire Brave New World (published in 1932) he had envisaged a world of shopping and leisure, founded upon gentle Pavlovian conditioning of eugenically perfected infants, made ready for the soft bed of a world constructed in accordance with Freud’s pleasure principle. In Brave New World, everyone is Dolly the Sheep, and so more forcible means of coercion have become a thing of the forgotten past. George Orwell wrote of his old teacher Huxley’s prophecy as follows:

“Mr Aldous Huxley’s Brave New World was a good caricature of the hedonistic Utopia, the kind of thing that seemed possible and even imminent before Hitler appeared, but it had no relation to the actual future. What we are moving towards at this moment is something more like the Spanish Inquisition, and probably far worse, thanks to the radio and the secret police. There is very little chance of escaping it unless we can reinstate the belief in human brotherhood without the need for a ‘next world’ to give it meaning.”2

Of course, it has turned out to be more complicated than that. Stalin died, and the Eastern Bloc with its many citizen spies and Stasi Thought Police was eventually overthrown by resistance within as much as without. Aldous Huxley always maintained that all forms of brutal totalitarian oppression must eventually succumb to such internal pressures, being forced to give way to a different and softer kind of centralised control, and for a short time it seemed that he was correct. But then came September 11th, and how quickly in its shadow the jackboots came back on the ground, stomping down on the face of humanity all across the world.

About a decade ago:

In January 2002, in the months following the September 11th attacks, the US Defense Department, under the umbrella of the Defense Advanced Research Projects Agency (DARPA), began to develop a vast surveillance project, requiring a database even beyond H.G. Wells’ imagining. Set up under the direction of Admiral John Poindexter – formerly Ronald Reagan’s National Security Advisor3 – the Information Awareness Office (IAO) was intended to serve the interests of “National Security”. Its aim was to establish methods of collecting and collating information of all kinds: records of what an individual purchased, where they travelled, what they watched, and so on, whilst also incorporating information from public records on education and health. More covert snooping was also proposed as a necessary means of analysing internet use, emails, and faxes.

Other plans included the development of “human identification at distance” systems based on biometrics, which would obviate the current reliance on human operators to keep their eyes peeled. Combined with the ever-extending network of CCTV, such a system could conceivably keep track of the movements of the entire population. In a world soon to be filled with automated face-recognition systems or, more probably – given recent technological developments – whole-body scanners, it will be unnecessary for government authorities to force people to carry forms of identity (or under more extreme tyranny, to wear badges), because it will become impossible to hide.

By February 2003, the IAO had begun funding what it called the Total Information Awareness (TIA) Program, although by May 2003 the program had already been renamed the Terrorism Information Awareness Program in an attempt to allay growing public anxiety over its Orwellian spectre. Then in August 2003, Poindexter was forced to resign as TIA chief amid concerns that his central role in the Iran-Contra affair made him unfit to run a sensitive intelligence program. Soon after this the IAO closed and officially the TIA program was terminated with all funding removed, yet it is widely acknowledged that the core of the project remains and that funding was merely switched to other government agencies.4

Finally, perhaps some indication of the true intent of these surveillance projects may be gleaned from the original IAO logo. Featuring a planetary-sized pyramid capped by an all-seeing eye that is scanning the entire Earth, the message is surely loud enough, especially when captioned with the motto “scientia est potentia” (knowledge is power). For what is this pyramid and the all-seeing eye meant to represent? That Big Brother is watching you? That you are already inside the Panopticon? Here was the official explanation of its meaning:

“For the record, the IAO logo was designed to convey the mission of that office; i.e., to imagine, develop, apply, integrate, demonstrate, and transition information technologies, components, and prototype, closed-loop information systems that will counter asymmetric threats by achieving total information awareness useful for preemption, national security warning, and national security decision making. On an elemental level, the logo is the representation of the office acronym (IAO) the eye above the pyramid represents “I” the pyramid represents “A,” and the globe represents “O.” In the detail, the eye scans the globe for evidence of terrorist planning and is focused on the part of the world that was the source of the attacks on the World Trade Center and the Pentagon.”5

Meanwhile, British governments have also brought in rafts of new legislation to extend police powers and limit personal freedom. Indeed, the first major new Terrorism Act, which was introduced in 2000 (and thus prior to the September 11th attacks), actually redefined the meaning of terrorism in order to increase the scope for police intervention. The disconcertingly titled RIP Act, which quickly followed, further extended the rights of government to intercept communications and to patrol the internet. Then, during David Blunkett’s tenure as Home Secretary, the RIP Act (or RIPA) was broadened again, becoming so extensive that almost 800 separate organisations, including more than 450 councils, have the right to invoke it. People might now be snooped on right across the country for offences no more serious than littering and under-age smoking.6

In the aftermath of the London bombings of July 7th 2005, the New Labour governments under both Blair and Brown also pressed hard for an extension of police rights to detain terrorist suspects. What had begun with seven days quickly progressed to three weeks, and then, at least in the government’s opinion, required not less than 90 days. The justification given for these extraordinary new measures – the worst of which were thankfully rejected by Parliament – was that plots of the most diabolical kind were suddenly so widespread and complex that the ordinary course of justice had to be by-passed in order to ensure public safety. Around the same time, the introduction of national ID cards was also thwarted, in part thanks to a massive public outcry. Nevertheless, the threat of terrorism (the real risk of which is far lower than during the days of IRA attacks) is the overriding justification for ever more surveillance of our public spaces and our personal lives.7

Throughout the last decade we have all been asked to give up our privacy and other civil liberties on the grounds of enhanced security: sacrificing freedom today for the sake of freedom tomorrow, which may well be, of course, a bargain with the devil. By the end of 2006, the United Kingdom was being described by some experts as ‘the most surveilled country’ among all industrialized Western nations.8

I heard someone speaking on Radio 4 a few years ago. Wrongly convicted of a crime he was later cleared of, he had as a direct consequence spent more than ten years of his life in prison. The interviewer asked him what his first thoughts were after being released as a free man. “Well, I was horrified,” he replied, “horrified that there were just as many cameras on the outside as inside. It was like I’d never left prison.”9

Now and the foreseeable future:

Under construction by contractors with top-secret clearances, the blandly named Utah Data Center is being built for the National Security Agency. A project of immense secrecy, it is the final piece in a complex puzzle assembled over the past decade. Its purpose: to intercept, decipher, analyze, and store vast swaths of the world’s communications as they zap down from satellites and zip through the underground and undersea cables of international, foreign, and domestic networks. The heavily fortified $2 billion center should be up and running in September 2013. Flowing through its servers and routers and stored in near-bottomless databases will be all forms of communication, including the complete contents of private emails, cell phone calls, and Google searches, as well as all sorts of personal data trails—parking receipts, travel itineraries, bookstore purchases, and other digital “pocket litter.” It is, in some measure, the realization of the “total information awareness” program created during the first term of the Bush administration—an effort that was killed by Congress in 2003 after it caused an outcry over its potential for invading Americans’ privacy10.

From an article entitled “The NSA is Building the Country’s Biggest Spy Center (Watch What You Say)” written by James Bamford, the author of The Shadow Factory: The Ultra-Secret NSA from 9/11 to the Eavesdropping on America. Published in Wired magazine on March 15th, Bamford continues:

For the first time, a former NSA official has gone on the record to describe the program, codenamed Stellar Wind, in detail. William Binney was a senior NSA crypto-mathematician largely responsible for automating the agency’s worldwide eavesdropping network. […]

He explains that the agency could have installed its tapping gear at the nation’s cable landing stations—the more than two dozen sites on the periphery of the US where fiber-optic cables come ashore. If it had taken that route, the NSA would have been able to limit its eavesdropping to just international communications, which at the time was all that was allowed under US law. Instead it chose to put the wiretapping rooms at key junction points throughout the country—large, windowless buildings known as switches—thus gaining access to not just international communications but also to most of the domestic traffic flowing through the US. […]

The eavesdropping on Americans doesn’t stop at the telecom switches. To capture satellite communications in and out of the US, the agency also monitors AT&T’s powerful earth stations, satellite receivers in locations that include Roaring Creek and Salt Creek. […]

Binney left the NSA in late 2001, shortly after the agency launched its warrantless-wiretapping program. “They violated the Constitution setting it up,” he says bluntly. “But they didn’t care. They were going to do it anyway, and they were going to crucify anyone who stood in the way. When they started violating the Constitution, I couldn’t stay.” Binney says Stellar Wind was far larger than has been publicly disclosed and included not just eavesdropping on domestic phone calls but the inspection of domestic email. At the outset the program recorded 320 million calls a day, he says, which represented about 73 to 80 percent of the total volume of the agency’s worldwide intercepts. The haul only grew from there. According to Binney—who has maintained close contact with agency employees until a few years ago—the taps in the secret rooms dotting the country are actually powered by highly sophisticated software programs that conduct “deep packet inspection,” examining Internet traffic as it passes through the 10-gigabit-per-second cables at the speed of light. […]

After he left the NSA, Binney suggested a system for monitoring people’s communications according to how closely they are connected to an initial target. The further away from the target—say you’re just an acquaintance of a friend of the target—the less the surveillance. But the agency rejected the idea, and, given the massive new storage facility in Utah, Binney suspects that it now simply collects everything. “The whole idea was, how do you manage 20 terabytes of intercept a minute?” he says. “The way we proposed was to distinguish between things you want and things you don’t want.” Instead, he adds, “they’re storing everything they gather.” And the agency is gathering as much as it can.

Once the communications are intercepted and stored, the data-mining begins. “You can watch everybody all the time with data- mining,” Binney says. Everything a person does becomes charted on a graph, “financial transactions or travel or anything,” he says. Thus, as data like bookstore receipts, bank statements, and commuter toll records flow in, the NSA is able to paint a more and more detailed picture of someone’s life.

Click here to read more of James Bamford’s eye-opening article, and then, here to read a still more extraordinary article published by Wired magazine on the very same day:

More and more personal and household devices are connecting to the internet, from your television to your car navigation systems to your light switches. CIA Director David Petraeus cannot wait to spy on you through them.

Earlier this month, Petraeus mused about the emergence of an “Internet of Things” — that is, wired devices — at a summit for In-Q-Tel, the CIA’s venture capital firm. “‘Transformational’ is an overused word, but I do believe it properly applies to these technologies,” Petraeus enthused, “particularly to their effect on clandestine tradecraft.”

All those new online devices are a treasure trove of data if you’re a “person of interest” to the spy community. Once upon a time, spies had to place a bug in your chandelier to hear your conversation. With the rise of the “smart home,” you’d be sending tagged, geolocated data that a spy agency can intercept in real time when you use the lighting app on your phone to adjust your living room’s ambiance.

“Items of interest will be located, identified, monitored, and remotely controlled through technologies such as radio-frequency identification, sensor networks, tiny embedded servers, and energy harvesters — all connected to the next-generation internet using abundant, low-cost, and high-power computing,”11

Orwell, for all of his profound insight and prescience, could never have imagined the sort of universal networks of surveillance being so rapidly put in place today. He didn’t see, for instance, as Huxley might have done, how people would one day almost willingly give up their privacy, and not only as the price for security, but purely for convenience and pleasure. That personal tracking devices would one day become such highly desirable commodities, in the form of mobile phones and ‘sat navs’, that it would actually be strange not to carry one. That social networking sites would be temptation enough for many millions to divulge huge volumes of personal information, private opinions, dreams and fantasies. That others would broadcast their thoughts via emails, tweets and blogs, and all could be swept up in a worldwide web. The worldwide wiretap, as Julian Assange referred to it.

This post is another part of the immense traffic of data presumably being collected and analysed by those at the NSA (and in all probability also filtered using servers at our own GCHQ). That you are reading this is most probably being recorded too. So feel free to add a comment, although you should be cautioned that whatever you do say may later be used as evidence against you. The Panopticon is watching all of us.

Click here to read a Wikipedia overview of the types of mass surveillance now used in the United Kingdom and elsewhere.

*

Additional:

Here is a Russia Today report broadcast a few days later on Friday 30th March entitled: “Minority report: Era of total surveillance zooms-in on US?”

Click here to find the same report at the Russia Today website.

As for Britain, and whatever the situation right now, the government is just about to announce new measures that will open the way for GCHQ to have “access to communications on demand, in real time”, with the justification being, as always, “to investigate serious crime and terrorism and to protect the public”:

A new law – which may be announced in the forthcoming Queen’s Speech in May – would not allow GCHQ to access the content of emails, calls or messages without a warrant.

But it would enable intelligence officers to identify who an individual or group is in contact with, how often and for how long. They would also be able to see which websites someone had visited.

Click here to read the full BBC news report from April 1st.

1 The setting is roughly as follows. Some time after a global war, the world has divided into three warring superstates: Oceania (the Americas, Australasia and Britain – renamed Airstrip One); Eurasia (Russia and the rest of continental Europe); and Eastasia (China, Japan and the surrounding territories). These states have since been engaged in an endless three-sided conflict, fighting to gain control of the resources in a disputed zone which stretches across North Africa, the Middle East, India and South-East Asia. Progress in this conflict is reported to the citizens of Oceania via a government-controlled media, relaying information manufactured by the Ministry of Truth.

2 Taken from “Notes on the way” by George Orwell, first published in Time and Tide, London, 1940.

3 Poindexter had previously been convicted of lying to Congress and of altering and destroying documents pertaining to the Iran-Contra Affair.

4 These include the Advanced Research and Development Activity (ARDA), part of the Disruptive Technology Office (which answers to the Director of National Intelligence); and SAIC, a company run by former defense and military officials, which had originally been awarded a US$19 million IAO contract to build the prototype system in late 2002.

5 Statement of the Information Awareness Office regarding the meaning and use of the IAO logo. Source: Question 15 of the IAO Frequently Asked Questions, a document dated February 2003, which can be accessed at http://www.darpa.mil/iao/TIA_FAQs.pdf

6 The Regulation of Investigatory Powers Act 2000 (RIP or RIPA) regulates the powers of public bodies to carry out surveillance and investigation, especially with regard to the interception of communications. It can be invoked by government officials specified in the Act on the grounds of national security, and for the purposes of preventing or detecting crime, preventing disorder, public safety, protecting public health, or in the interests of the economic well-being of the United Kingdom.

“Councils have used laws designed to combat terrorism to access more than 900 people’s private phone and email records in the latest example of Britain’s growing surveillance state. Town hall spies found out who residents were phoning and emailing as they investigated such misdemeanours as dog quarantine breaches and unlicensed storage of petrol. The news prompted fresh calls from civil rights groups for a reform of the Regulation of Investigatory Powers Act (Ripa), which was originally brought in to combat terrorism and serious crime but is increasingly being used by councils to snoop on members of the public. In April a council in Dorset used Ripa powers to spy for weeks on a family it wrongly suspected of breaking rules on school catchment areas. Other local authorities have used covert surveillance to investigate such petty offences as dog fouling and under-age smoking.” Extract from “Council snoopers access 900 phone bills” by Gordon Rayner, Chief Reporter, Daily Telegraph, 5th June 2008. http://www.telegraph.co.uk/news/2075026/Council-snoopers-access-900-phone-bills.html

7 “Deputy chief constable of Hampshire Ian Readhead said Britain could become a surveillance society with cameras on every street corner. He told the BBC’s Politics Show that CCTV was being used in small towns and villages where crime rates were low… ‘If it’s in our villages, are we really moving towards an Orwellian situation where cameras are at every street corner?’

‘And I really don’t think that’s the kind of country that I want to live in.’ There are up to 4.2 million CCTV cameras in Britain – about one for every 14 people.” From BBC News, Sunday, 20th May 2007. http://news.bbc.co.uk/1/hi/uk/6673579.stm

8 “Produced by a group of academics called the Surveillance Studies Network, the [Surveillance Society] report was presented to the 28th International Data Protection and Privacy Commissioners’ Conference in London, hosted by the Information Commissioner’s Office. […]

The report’s co-writer Dr David Murakami-Wood told BBC News that, compared to other industrialised Western states, the UK was “the most surveilled country”.

“We have more CCTV cameras and we have looser laws on privacy and data protection,” he said.

“We really do have a society which is premised both on state secrecy and the state not giving up its supposed right to keep information under control while, at the same time, wanting to know as much as it can about us.”

The report coincides with the publication by the human rights group Privacy International of figures that suggest Britain is the worst Western democracy at protecting individual privacy.

The two worst countries in the 36-nation survey are Malaysia and China, and Britain is one of the bottom five with ‘endemic surveillance’.”

From a BBC news article entitled “Britain is ‘surveillance society’” published on November 2, 2006. http://news.bbc.co.uk/1/hi/uk/6108496.stm

9 Unfortunately, since I did not have a pen at hand – I was driving at the time! – I can no longer recall his precise words and so I have been compelled to paraphrase what he said. I have tried to be accurate so far as memory serves me.

10 From an article entitled “The NSA is Building the Country’s Biggest Spy Center (Watch What You Say)” written by James Bamford, published in Wired magazine on March 15, 2012. http://www.wired.com/threatlevel/2012/03/ff_nsadatacenter/all/1

11 From an article entitled “CIA Chief: We’ll Spy on You Through Your Dishwasher”, written by Spencer Ackerman, published by Wired magazine on March 15, 2012. http://www.wired.com/dangerroom/2012/03/petraeus-tv-remote/
