aimless weather

The following article is Chapter One of a book entitled Finishing The Rat Race.

All previously uploaded chapters are available (in sequence) by following the link above or from the category link in the main menu, where you will also find a table of contents and a preface on why I started writing it.

*

“Let no one enter here who does not have faith”

— Inscription over the door of Max Planck’s Laboratory

*

“In the beginning God created the heaven and the earth. And the earth was without form, and void; and darkness was upon the face of the deep. And the spirit of God moved upon the face of the waters. And God said, Let there be light: and there was light.”

These were the words spoken by the astronauts on board Apollo 8 once they had established a lunar orbit, thereby becoming the first humans ever to leave Earth fully behind them. As a literary choice, it was one that inevitably caused considerable irritation, especially amongst atheists around the world.

Undoubtedly there was more than a little politics involved in the Apollo astronauts’ decision to read passages from the Bible. Given that the Cold War face-off had provided such impetus for the entire space programme, with America having steadily beaten off the challenge of the godless Soviets, the words transmitted, if nothing else, a kind of undiplomatic rebuke; one redoubled when the Eagle landing module touched down just a few months later and the astronauts’ first duty became to plant the Stars and Stripes in the pristine moon dust. Skipping about in delight, taking some holiday snaps and bringing home a basket full of moon-rocks was no longer enough.

Not that I am trying to rain on anyone’s parade. Far from it. The Moon landing involved not merely a tremendous technical achievement but also a hell of a lot of guts. It was one moment when ordinary Americans had every reason to feel pride. Viewed in an alternative light, however, this towering and singular accomplishment was also the extraordinary end product of many centuries of truly international effort. A high point in a centuries-long science and engineering project set in motion by pioneers like Galileo, Kepler and, of course, Newton, which finally culminated on July 20th 1969 with such a genuinely epoch-marking event that for many minutes the world collectively held its breath … 1

Apollo 8 had been one of the more important reconnaissance missions necessary to lay the groundwork for the moon landing itself. Another small step that led directly to that most famous step in history (so far), although as a step, the Apollo 8 mission was also breathtaking in its own right. As for the grumbling about the transmission of passages from Genesis, well, the inclusion of any kind of religious element seemed inappropriate to many. After all, science and religion are not supposed to mix; but on top of which, having gained seeming ascendancy, why was Science suddenly playing second fiddle again?

Religion, as a great many of its opponents readily point out, is superstition writ largest. Science, by contrast, purposefully renounces the darkness of superstition, and operates solely by virtue of the assiduous application of logic and reason. Science and religion are therefore as incompatible as night and day, and so when it came to the cutting edge of space exploration, just what did the Bible have to do with any of it? Sir Isaac Newton was doing the driving, wasn’t he…?

On the other hand, and playing Devil’s Advocate, why not choose these words? After all, the circumstances lent a strange appropriateness and charge to the plain vocabulary of Genesis: heaven and earth; void and darkness; the face of the waters. A description of the act of creation so understated, and yet evocative, that it’s hard to recall a more memorable paragraph in the whole literary canon, or one with greater economy. If the astronauts or NASA were endorsing the biblical story of creation that would have been another matter, of course, but here I think we can forgive the perceived faux pas – ‘one false step amidst a giant leap forward for mankind!’

My personal wish is that as Neil and Buzz were setting off to “where no man had gone before”, climbing into their Lunar Landing Module and sealing the air-lock behind them, they might forgetfully have left the flag behind to keep Michael Collins company. Leaving no signs of their extraordinary visit besides the landing section of the strange metal beetle they had flown in, and, beside it, their monumental, and somehow still astonishing, footprints.

*

Very occasionally I happen to meet intelligent and otherwise rational people who make the claim that the biblical story of creation is broadly supported by the latest scientific discoveries. The universe began at a moment, they’ll explain, just as it is written. There then followed a succession of events, leading to the eventual rise of Man. All of this, they’ll insist, accurately checks out with the opening page of Genesis, whilst the theories of modern cosmology and evolutionary biology simply patch in the occasional missing details. And truly, this is a desperate line of defence!

For there is no amount of creative Biblical accountancy – of interpreting days as epochs and so forth – which can successfully reconstruct the myth of Genesis so as to make it scientifically sound. The world just wasn’t created that way – wasn’t created at all, apparently – and creationism, which claims to be an alternative theory when it offers no theory at all, also fails to withstand the minutest degree of scrutiny. No, creationism survives merely on account of the blind and desperate faith of its adherents. Here indeed is how a modern cosmologist might have gone about rewriting the Biblical version (if by chance they had been on hand to lend God a little assistance):

“In the beginning God created a small but intense fireball. A universal atom into which space and time itself were intrinsically wrapped. As this primordial fireball very rapidly expanded and cooled, the fundamental particles of matter condensed out of its energetic froth, and by coalescence, formed into atoms of hydrogen, helium and lithium. All this passed in a few minutes.

Clouds of those original elements, collapsing under their own weight, then formed into the first stars, the loss of gravitational potential energy heating the gases in these proto-stars to sufficiently high temperatures (many millions of degrees) to trigger nuclear fusion. In the cores of such early giants, the atoms of hydrogen and helium were now just beginning to be fused into ever-heavier elements through a series of stages known as nucleosynthesis. Happily, this fusion of smaller atoms into increasingly larger ones generated an abundance of energy. Enough to keep the core temperature of each star in the millions of degrees; hot enough to sustain the fusion of more and more atoms. So it was that the hydrogen begat helium, helium begat carbon and oxygen, carbon begat neon and magnesium… And God saw that it was good.

After a few billion years had passed, these same stars, which had hitherto been in a state of hydrostatic balance – thermal and radiation pressure 2 together supporting the weight of the gases – were burning low on fuel. During this last stage, at the end of a long chain of exergonic 3 fusion reactions, atoms as large as iron were being created for the very first time. Beyond the production of iron, however, nucleosynthesis into even heavier elements consumes rather than releases energy, and so the process of fusion could no longer remain self-sustaining. So it came to pass that the first generation stars were starting to die.

But these stars were not about to fizzle out like so many guttering candles. The final stage of their demise involved not a whimper, but bangs of unimaginable power. Beginning as a collapse, an accelerating collapse that would inevitably and catastrophically rebound, each star was torn apart within a few seconds, the remnants propelled at hyper-velocities out into deep space. And it was during these brief but almighty supernova explosions when the heavier elements (lead, gold and ultimately all the stable elements in the periodic table) came into being.

Ages came and passed. Pockets of the supernova debris, now drifting about in tenuous clouds, and enriched with those heavier elements, began to coalesce a second time: the influence of gravity rolling the dust into new stars. Our Sun is a star born not from that generation, but a later one, being one of an almost countless number of third-generation stars: our entire Solar System emerging indeed from a twice-processed aggregation of swirling supernova debris. All this had passed around 5 billion years ago, approximately 9 billion years after the birth of time itself.”

Now very obviously in this modern reworking there can be no Earth at the time of creation, so the story in Genesis fails to accord with the science right from its outset: from chapter one, verse one. For there is simply no room for the Earth when the whole universe is still smaller than a grapefruit.

I can already hear the protests of course: for Earth we must read Universe apparently, in order to make any meaningful comparison. Okay, so playing along, what then becomes of heaven? For God created both heaven and earth remember. Well, if heaven was once some place above our heads (as it surely was for people living under the stars at the time when Genesis was written) then to accord with the current theories of cosmology, perhaps those who still subscribe to the entire Biblical story imagine its existence as a parallel universe; linked through a wormhole we call death. Truly, the Lord works in mysterious ways!

*

Some readers will doubtless baulk at the idea of God being the creator of anything, and yet I think we should honestly admit that nothing in modern cosmology precludes with any certainty the existence of an original creative force; of God only as the primum mobile, the first mover, igniting the primordial spark. Indeed, it may come as a surprise to discover (as it did for me) that one of the first proponents of the currently accepted scientific theory – now universally known as the Big Bang Theory – was by vocation a Roman Catholic priest.

Father Georges Lemaître, a Belgian professor of physics and astronomy, having quickly recognised the cosmological possibilities latent within Einstein’s then still novel theory of General Relativity, published his ‘hypothesis of the primeval atom’ in the prestigious scientific journal Nature as long ago as 1931. Yet interestingly, his ideas did not receive much support at the time, in part due to lack of evidence, but also because many contemporary physicists initially rejected all such theories of spontaneous universal origin as being an entirely religious import. But science isn’t built on belief, and so it can’t be held hostage to orthodoxy in the same way that religious conviction can. This is where science and religion absolutely depart. Although, in order to explore this further, it is first helpful to consider two vitally important though rather difficult questions: “what is science?” and “what is religion?”

*

I have a friend who tells me that science is the search for knowledge; an idea that fits very happily with the word’s etymology: from the Latin scientia, meaning “knowledge” (from scire, “to know”). Meanwhile the dictionary itself offers another useful definition: “a branch of knowledge conducted on objective principles involving the systematized observation of, and experiment with phenomena.” According to this more complete description, science is not any particular set of knowledge, but rather a system or systems that aim at objectivity.

Scientific facts exist, of course, but these are simply ideas that have been proved beyond reasonable dispute. For instance, that the Earth is a ball that moves around the Sun. This is a fact and it is a scientific one. For the most part, however, scientists do not work with facts as straightforward as this. And rather than facts, the most common currency of working scientists is theories. Scientific theories are not to be believed in as such, but are a means to encompass the best understanding available. They exist in order to be challenged, and thus to be improved upon.

In Science, belief begins and ends as follows: that some forms of investigation, by virtue of being objective, lead to better solutions than other, less objective approaches. This is the only orthodoxy to which all scientists are committed, and so, in the final analysis, being scientific means nothing more or less than an implicit refusal to admit knowledge aside from what can be observed and measured. For Science is an inherently empirical approach, with its prime directive and perhaps also its élan vital being that: in testing, we trust.

I could leave the question of science right there and move on to consider the question of religion, but before I do so, I would like to put one important matter straight. Whatever it is that science is and does, it also helps to understand that the majority of scientists rarely if ever consider this question.

As a physics undergraduate myself, I learnt quite literally nothing about the underlying philosophies of science (there was an additional module – a final year option – addressing this topic but unfortunately it was oversubscribed). Aside from this, I was never taught to analyse the empirical method in and of itself. I personally learnt absolutely nothing about hypotheses, let alone how to test them (and in case this should lead readers to think my university education was itself substandard, then let me also admit, at the risk of appearing an arrogant braggart, that I attended one of the best scientific academies in the country – Imperial College would no doubt say the best). Yet they did not teach us about hypotheses, and for the simple reason that the vast majority of physicists rarely bother their heads about them. Instead, the scientists I’ve known (and again, I was a research student for three years) do what might be broadly termed “investigations”.

An investigation is just that, and it might involve any variety of techniques and approaches. During the most exciting stages of the work, the adept scientist may indeed rely as much on guesswork and intuition as on academic training and logical reasoning. Famously, for example, the chemist August Kekulé dreamt up the structure of benzene in his sleep. Proving the dream correct obviously required a bit more work.

The task set for every research scientist is to find answers. Typically then, scientists are inclined to look upon the world as if it were a puzzle (the best puzzle available), and as with any other puzzle, the point is just to find a satisfactory solution.

So why then did I begin with talk of scientific methods? Well because, as with most puzzles, some methods will prove more efficacious than others, but also because in this case there is no answer to be found at the bottom of page 30 – so we’d better be as sure as we can, that the answer we find is the best available one. Which in turn means applying the best (i.e., most appropriate and reliable) methods at hand, or else developing still better ones.

By ‘method’, I do not mean simply whatever approach the scientist employs to test his or her own guesses about the puzzle, but just as importantly, a system that can be used to prove this solution to the satisfaction of a wider scientific community. For methods too are accepted only once they have been tried and tested.

So when the philosopher Karl Popper claims that the scientific method depends upon “testable hypotheses” (or as my friend calls them “detestable hypotheses”) I would say fair enough… but let’s not mistake this definition for a description of what scientists actually do. We may accept that science must make statements that can be falsified – this is indeed a useful “line of demarcation”, as Popper puts it 4 – and we can call these statements “testable hypotheses” if we choose – but science is simply about broadening and refining our knowledge and understanding, and any approach that is scientifically accountable will really do just fine.

*

So what of religion? Well, that’s a pricklier issue again, of course, so let me swerve clear of any direct answer for the moment so as to draw a further comparison with science.

Where a religious person may say, I have faith in such and such because it is written so, a scientist, assuming she is honest, ought only to say that “given the evidence we have thus far collected and collated, our best explanation is the following…” As more evidence becomes available, our scientist, assuming she has integrity (at least as a scientist), may humbly (or not) concur that her previously accepted best explanation is no longer satisfactory. In short, the scientist is always required by virtue of their profession to keep an open mind; the truth of their discipline being something that’s forever unfolding and producing facts that are rarely final.

For the religious-minded, however, the very opposite may apply, and for all who know that the true shape of things is already revealed to them through faith, there must be absolute restrictions to further open-minded inquiry. (Not that all religions stress the importance of such unassailable beliefs – some do not.)

Where it is the duty of every scientist to accept all genuine challenges, and to allow (as Richard Feynman once put it) for Nature to come out “the way she is”, it is the duty of many religious believers (though not, as I say, of all who are religious) to maintain a more rigidly fixed view of the world. Here again, however, it ought to be stressed that the scientist’s constant and single-minded aim for objectivity is not necessarily dependent on his or her lack of beliefs or subjective opinion – scientists are, after all, only human. So virtually all scientists come to their puzzles with preconceived hunches, and, whether determined by the head or the heart, have a preference for one solution over another. But this doesn’t much matter, so long as they are rigorous in their science.

Indeed, many of the most brilliant scientific minds have also held strongly religious convictions (Newton and Darwin spring immediately to mind). In studying that great work called Nature, Newton was implicitly trying to understand the mind of God, and in the end Newton’s discoveries did not shatter his belief in God, but instead confirmed for him that there must be an intelligent agency at large, or at least one that set things initially in motion. Darwin’s faith was more fundamentally rocked (as we shall see), yet he came to study Nature as another devout believer. But the art of the scientist in every case is to recognise such prejudices and put them to one side, and this is the original reason for developing such strict and rigorous methodologies. Ultimately, to reiterate, Science is no more or less powerful than its own methods of inquiry. Which is how it was that physicists and astronomers gradually put aside their reservations as the evidence grew in favour of Father Lemaître’s theory of creation.

So the lesson here is that whereas religion demands faith, science asks always for the allowance of doubt and uncertainty. And just as St Thomas asked to see the holes in Christ’s palms, so too every responsible scientist is called to do the same, day in and day out. Doubting Thomas should be a patron saint of all scientists.

*

I wish to change the subject. It is not my aim to pitch science against religion and pretend that science is somehow the victor, when in truth I regard this as a phoney war. On its own territory, which is within the bounds of what is observable and measurable, science must always win. This is inevitable. Those who still look for answers to scientific questions in the ancient writings of holy men are only deceiving themselves.

But science too has its boundaries, and, as the philosopher Ludwig Wittgenstein argued in his famous (if notoriously difficult) Tractatus Logico-Philosophicus – proceeding via an interwoven sequence of numbered and nested propositions and aphorisms to systematically unravel the complex relationship between language, thought and the world – rational inquiry, though our most promising guide for uncovering the facts of existence, can never be complete.

Just as the Universe apparently won’t allow us to capture every last drop of heat energy and make it do work for us, at least according to current thermodynamic theories, so Wittgenstein argued (to his own satisfaction and also to the exacting standards of Bertrand Russell) an analogous limitation applies to all systems of enquiry designed for capturing truth. Even the most elaborate engines in the world cannot be made 100% efficient, and likewise the most carefully constructed forms of philosophical investigation, even accepting Science as the most magnificent philosophical truth engine we shall ever devise (as Wittgenstein did 5), will inescapably be limited to that same extent – perfection in both cases being simply unattainable.

Many have racked their brains to think up the most cunning of contraptions, but none have invented a perpetual motion machine, and the same, according to Wittgenstein, goes for anyone wishing to generate any comprehensive theory of everything, which is just another human fantasy. 6 Most significantly and most controversially, Wittgenstein says that no method can be devised for securing any certain truths regarding ethics, aesthetics, or metaphysics, and that consequently all attempts at pure and detached philosophical talk of these vital matters are mere sophistry.

Having revealed the ultimate limitations to reasoning, Wittgenstein then arrives at his seventh, and perhaps most famous proposition in this most famous and celebrated of works. A stand-alone declaration: it is the metaphorical equivalent of slamming the book shut!

“What we cannot speak of we must pass over in silence,” 7 he says, suddenly permitting himself the licence of a poet.

This was his first and also last hurrah as a philosopher (or so he thought 8), Wittgenstein taking the lead from his own writings – and what greater measure of integrity for a philosopher than to live according to their own espoused principles. Ditching his blossoming career at Cambridge, he set out in pursuit of a simpler life back in his Austrian homeland, first (and somewhat disastrously) as a primary school teacher, and then more humbly as a gardener at a monastery. (Although at length, of course, he did famously return to Cambridge to resume and extend his “philosophical investigations”).

But isn’t this all just a redressing of much earlier ideas of scepticism? Well, Wittgenstein is quick to distance himself from such negative doctrines, for he was certainly not denying truth in all regards (and never would). But faced by our insurmountable limitations to knowledge, Wittgenstein is instead asking those who discuss philosophies beyond the natural sciences to intellectually pipe-down. Perhaps he speaks too boldly (some would say too arrogantly). Maybe he’s just missing the point that others more talented would have grasped, then stomping off in a huff. After all, he eventually turned tail in 1929, picking up where he’d left off in Cambridge, returning in part to criticise his own stumbling first attempt. But then what in philosophy was ever perfectly watertight?

The one thing he was constantly at pains to point out: that all philosophy is an activity and not, as others had believed, the golden road to any lasting doctrinal end. 9 And it’s not that Wittgenstein was really stamping his feet and saying “impossible!”, but rather that he was attempting to draw some necessary and useful boundaries. Trying to stake out where claims to philosophic truth legitimately begin and end. An enterprise perhaps most relevant to the natural sciences, an arena of especially precise investigation, and one where Wittgenstein’s guiding principle – that whatever can be said at all can be said clearly – can be held as a fair measure against all theories. Indeed, I believe this insistence upon clarity provides a litmus test for claims of “scientific objectivity” from every field.

Embedded above is a film by Christopher Nupen entitled “The Language Of The New Music” about Ludwig Wittgenstein and Arnold Schoenberg; two men whose lives and ideas run parallel in the development of Viennese radicalism. Both men emerged from the turmoil of the Habsburg Empire in its closing days with the idea of analysing language and purging it with critical intent, believing that in the analysis and purification of language lies the greatest hope that we have.

*

Let me return to the question of religion itself, not to inquire further into “what it is” (since religion takes many and varied forms, the nature of which we may return to later), but rather to ask more pragmatically “whether or not we are better with or without it”, in whatever form. A great many thinkers past and present have toyed with this question; a considerable few finding grounds to answer with a very resounding “without”.

In current times there has been no more outspoken advocate of banishing all religion than the biologist Richard Dawkins. Dawkins, aside from being a scientist of unquestionable ability and achievement, is also an artful and lively writer; his books on neo-Darwinian evolutionary theory being just as clear and precise as they are wonderfully detailed and inspiring. He allows Nature to shine forth with her own brilliance, though never shirking descriptions of her darker ways. I’m very happy to say that I’ve learnt a great deal from reading Dawkins’ books and am grateful to him for that.

In his most famous (although by no means his best) book, The Selfish Gene, Dawkins set out to uncover the arena wherein the evolution of life is ultimately played out. After carefully considering a variety of hypotheses including competition between species, or the rivalry within groups and between individuals, he concludes that in all cases the real drama takes place at a lower, altogether more foundational level. Evolution, he explains, after a great deal of scrupulous evidential analysis, is driven by competition between fragments of DNA called genes, and these blind molecules care not one jot about anything or anyone. This is why the eponymous gene is so selfish (and Dawkins may perhaps have chosen his title a little more carefully, since those who haven’t read beyond the cover may wrongly presume that scientists have discovered the gene for selfishness, which is most certainly not the case). But I would like to save any further discussion about theories of biological evolution, and of how these have shaped our understanding of what life is (and hence what we are), for later chapters. Here instead I want to briefly consider Dawkins’ idea not of genes but “memes”.

*

In human society, Dawkins says in the final chapter of The Selfish Gene, change is effected far more rapidly by shifts in ideas than by those steadier shifts in our biology. So in order to understand our later development, he presents the notion of a parallel evolution between kinds of primal idea-fragments, which he calls “memes”. Memes that are most successful (i.e., the most widely promulgated) will, says Dawkins, like genes, possess particular qualities that increase their chances of survival and reproduction. In this case, memes that say “I am true so tell others” or more dangerously “destroy any opposition to my essential truth” are likely to do especially well in the overall field of competition. Indeed, says Dawkins, these sorts of memes have already spread and infected us like viruses.

For Dawkins, religious beliefs are some of the best examples of these successful selfish memes, persisting not because of any inherent truth, but simply because they have become wonderfully adapted for survival and transmission. His idea (a meme itself presumably) certainly isn’t hard science – in fact it’s all rather hand-waving stuff – but as a vaguely hand-waving response I’d have to admit that he has a partial point. Ideas that encourage self-satisfied proselytising are often spread more virulently than similar ideas that do not. Yet ideas also spread because they are just frankly better ideas, so how can Dawkins’ theory of memes bring this more positive reason into account? Can his same idea explain, for instance, why the ideas of science and liberal humanism have also spread so far and wide? Aren’t these merely other kinds of successful meme that have no special privilege above memes that encourage sun worship and blood sacrifice?

My feeling here is that Dawkins comes from the wrong direction. There is no rigorous theory for the evolution of memes, nor can there be, since there is no clearly discernible, let alone universal, mechanism behind the variation and selection of ideas. But then of course Dawkins knows this perfectly well and never attempts to make a serious case. So why does he mention memes at all?

Well, as an atheistic materialist, he obviously already knows the answer he wants. So this faux-theory of memes is just his damnedest attempt to ensure the right result. Religion operating as a virus is an explanation that plainly satisfies him, and whilst his route to discovering that answer depends on altogether shaky methodology, he puts aside his otherwise impeccable scientific principles, and being driven to prove what in truth he only feels, he spins a theory backwards from a prejudice. What Dawkins and others have perhaps failed to recognise is that in the fullest sense, questions of religion – of why we are here, of why we suffer, of what makes a good life – will never be cracked by the sledgehammer of reason, for questions of value lie outside the bounds of scientific analysis. Or if he does recognise this, then the failing instead is to understand that there are many, quite different in temperament, who will always need attempted answers to these profound questions.

*

I didn’t grow up in a particularly religious environment. My mother had attended Sunday school, and there she’d learnt to trust in the idea of heaven and the eternal hereafter. It wasn’t hell-fire stuff and she was perfectly happy to keep her faith private. My father was more agnostic. He would probably now tell you that he was always an atheist but, in truth, and like many good atheists, he was actually an agnostic. The test of this is simple enough: the fact that he quite often admitted how nice it would be to have faith in something, although his own belief was just that Jesus was a good bloke and the world would be much nicer if people tried to emulate him a bit. (Which is a Christian heresy, of course!)

I was lucky enough to attend a small primary school in a sleepy Shropshire village. Although it was a church school of sorts, religious instruction involved nothing more than the occasional edifying parable, various hymns, ancient and modern, and the Lord’s Prayer mumbled daily at the end of assembly. Not exactly what you’d call indoctrination. At secondary school, religious instruction became more formalised – one hour each week, presumably to satisfy state legislation. Then, as the years went by, our lessons in R.E. shifted from a purely Christian syllabus to one with more multicultural aspirations. So we learnt about Judaism, Islam, and even Sikhism, although thinking back I feel sure that our teacher must have delivered such alternative lessons through gritted teeth. I recall once how a classmate confused the creature on top of a Christmas tree with a fairy. Hark, how you should have heard her!

Being rather devout, this same teacher – a young, highly-strung, and staunchly virginal spinster – also set up a Christian Union club that she ran during the lunch hour, and for some reason I joined up. Perhaps it had to do with a school-friend telling me about Pascal’s wager: that you might as well believe in God since you stand to gain so much for the price of so small a stake. In any case, for a few weeks or months I tried to believe, or at least tried to discover precisely what it was that I was supposed to be believing in, though I quickly gave up. Indeed, the whole process actually made me hostile to religion. So for a time I would actively curse the God in the sky – test him out a bit – which proves only that I was believing in something.

Well, to cut a long story short, whatever strain of religion I’d contracted, it was something that did affect me to a considerable extent in my late teens and early twenties. Of course, by then I regarded myself a fervent atheist, having concluded that “the big man in the sky” was nothing more or less than an ugly cultural artefact, something alien, someone else’s figment planted in my own imagination… and yet still I found that I had this God twitch. Occasionally, and especially for some reason whilst on long journeys driving the car, I’d find myself ruminating on the possibility of his all-seeing eyes watching over me. So, by and by, I decided to make a totally conscious effort to free myself from this mind-patrolling spectre, snuffing out all thought of God whenever it arose. To pay no heed to it. And little by little the thought died off. God was dead, or at least a stupid idea of God, a graven image, and one I’d contracted in spite of such mild exposure to Christian teachings. A mind-shackle that was really no different from my many other contracted neuroses. Well, as I slowly expunged this chimera, I discovered another way to think about religion, although I hesitate to use such a grubby word – but what’s the choice?

Spirituality – yuck! It smacks of a cowardly cop-out to apply such a slippery alternative. A weasel word. A euphemism almost, to divert attention from mistakes of religions past and present. But are there any more tasteful alternatives? And likewise – though God is just such an unspeakably filthy word (especially when He bears an upper case G like a crown), what synonym can serve the same purpose? You see how difficult it is to talk of such things when much of the available vocabulary offends (and for some reason we encounter similar problems talking about death, defecation, sex and a hundred other things, though principally death, defecation and sex). So allow me to pass the baton to the greatly overlooked genius of William James, who had a far greater mastery over words than myself, and is a most elegant author on matters of the metaphysical.

*

“There is a notion in the air about us that religion is probably only an anachronism, a case of ‘survival’, an atavistic relapse into a mode of thought which humanity in its more enlightened examples has outgrown; and this notion our religious anthropologists at present do little to counteract. This view is so widespread at the present day that I must consider it with some explicitness before I pass to my own conclusions. Let me call it the ‘Survival theory’, for brevity’s sake.” 10

Here is James steadying himself before addressing his conclusions regarding The Varieties of Religious Experience. The twentieth century has just turned. Marx and Freud are beginning to call the tunes: Science, more broadly, is in the ascendant. But I shall return to these themes later in the book, restricting myself here to James’ very cautiously considered inquiries into the nature of religion itself and why it can never be adequately replaced by scientific objectivity alone. He begins by comparing the religious outlook to the scientific outlook and by considering the differences between each:

The pivot round which the religious life, as we have traced it, revolves, is the interest of the individual in his private personal destiny. Religion, in short, is a monumental chapter in the history of human egotism… Science on the other hand, has ended by utterly repudiating the personal point of view. She catalogues her elements and records her laws indifferent as to what purpose may be shown by them, and constructs her theories quite careless of their bearing on human anxieties and fates… 11

This is such a significant disagreement, James argues, that it is easy to sympathise with the more objective approach guaranteed by the hard-edged precision of science, and to dismiss religious attitudes altogether:

You see how natural it is, from this point of view, to treat religion as mere survival, for religion does in fact perpetuate the traditions of the most primeval thought. To coerce the spiritual powers, or to square them and get them on our side, was, during enormous tracts of time, the one great object in our dealings with the natural world. For our ancestors, dreams, hallucinations, revelations, and cock-and-bull stories were inextricably mixed with facts… How indeed could it be otherwise? The extraordinary value, for explanation and prevision, of those mathematical and mechanical modes of conception which science uses, was a result that could not possibly have been expected in advance. Weight, movement, velocity, direction, position, what thin, pallid, uninteresting ideas! How could the richer animistic aspects of Nature, the peculiarities and oddities that make phenomena picturesquely striking or expressive, fail to have been singled out and followed by philosophy as the more promising avenue to the knowledge of Nature’s life. 12

As true heirs to the scientific enlightenment, we are asked to abandon such primeval imaginings and, by a process of deanthropomorphization (to use James’ own deliberately cumbersome term), which focuses only on the precisely defined properties of the phenomenal world so carefully delineated by science, sever the private from the cosmic. James argues, however, that such enlightenment comes at a cost:

So long as we deal with the cosmic and the general, we deal only with the symbols of reality, but as soon as we deal with private and personal phenomena as such, we deal with realities in the completest sense of the term. 13

Thus, to regard one’s life entirely through the pure and impersonal lens of scientific inquiry is to see through a glass, not so much too darkly, as too impartially. To be expected to leave out from our descriptions of the world “all the various feelings of the individual pinch of destiny, [and] all the various spiritual attitudes”, James writes, is like being offered “a printed bill of fare as the equivalent for a solid meal.” He expresses the point most succinctly, saying:

It is impossible, in the present temper of the scientific imagination, to find in the driftings of cosmic atoms, whether they work on the universal or on the particular scale, anything but aimless weather, doing and undoing, achieving no proper history, and leaving no result.

This is the heart of the matter, and the reason James surmises, quite correctly in my opinion:

… That religion, occupying herself with personal destinies and keeping thus in contact with the only absolute realities which we know, must necessarily play an eternal part in human history. 14

*

Mauro Bergonzi, Professor of Religion and Philosophy in Naples, speaks about the utter simplicity of what is:

*

“I gotta tell you the truth folks,” comedian George Carlin says at the start of his most famous and entertaining rant, “I gotta tell you the truth. When it comes to bullshit – big-time, major league bullshit! You have to stand in awe of the all-time champion of false promises and exaggerated claims: Religion! Think about it! Religion has actually convinced people that there’s an invisible man! – living in the sky! –  who watches everything you do, every minute of every day…”

And he’s right. It’s bonkers but it’s true, and Carlin is simply reporting what many millions of people very piously believe. Sure, plenty of Christians, Muslims and Jews hold a more nuanced faith in their one God, and yet for vast multitudes of believers, this same God is nothing but a bigger, more powerful, humanoid. A father figure.

“Man created God in his own image,” is the way a friend once put it to me. And as a big man, this kind of a God inevitably has a big man’s needs.

Of course, the gods of most, if not all, traditions have been in the business of demanding offerings of one kind or another to be sacrificed before them, for what else are gods supposed to receive by way of remuneration for their services? It’s hardly surprising then that all three of the great Abrahamic faiths turn sacrifice into a central theme. But then what sacrifice can ever be enough for the one-and-only God who already has everything? Well, as George Carlin points out, God is generally on the lookout for cash:

“He’s all-powerful, all-perfect, all-knowing and all-wise, but somehow just can’t handle money!” But still, cash only goes so far. Greater sacrifices are also required, and, as the Old Testament story of Abraham and Isaac makes abundantly clear, on some occasions nothing less than human blood-sacrifice will do. 15 The implicit lesson of this story being that the love of our Lord God requires absolute obedience, nothing less. For ours is not to reason why…

“Oh, God you are so big!” the Monty Python prayer begins – bigness being reason enough to be awed into submission. But God also wants our devotion, and then more than this, he wants our love to be unconditional and undiluted. In short, he wants our immortal souls, even if for the meantime, he’ll settle for other lesser sacrifices in lieu.

As for the more caring Christian God (the OT God restyled), well here the idea of sacrifice is up-turned: the agonising death of his own son on Golgotha apparently satisfying enough to spare the rest of us. It’s an interesting twist, even if the idea of a sacrificed king is far from novel; dividing his former wholeness and then sacrificing one part of himself to secure the eternal favour of his other half is a neat trick.

But still, why the requirement for such a bloody sacrifice at all? Well, is it not inevitable that every almighty Lord of Creation must sooner or later get mixed up with the God of Death? For what in nature is more unassailable than Death; the most fearsome destroyer who ultimately smites all. Somehow this God Almighty must have control over everything and that obviously includes Death.

“The ‘omnipotent’ and ‘omniscient’ God of theology,” James once wrote in a letter, “I regard as a disease of the philosophy shop.” And here again I wholeheartedly agree with James. Why…? For all the reasons given above, and, perhaps more importantly, because any “one and only” infinitist belief cannot stand the test at all. Allow me to elucidate.

The world is full of evils; some of these are the evils of mankind, but certainly not all. So what sort of a God created amoebic dysentery, bowel cancer and the Ebola virus? And what God would allow the agonies of his floods, famines, earthquakes, fires and all his other wondrously conceived natural disasters? What God would design a universe of such suffering that he invented the parasitic wasps that sting their caterpillar hosts to leave them paralysed, laying their eggs inside so that their grubs will eat the living flesh?

The trouble is that any One True Lord, presuming this Lord is also of infinite goodness, needs, by necessity, a Devil to do his earthly bidding. This is unavoidable because without an evil counterpart such an infinite and omnipotent God, by virtue of holding absolute power over all creation, must thereby permit every evil in this world, whether man-made or entirely natural in origin. And though we may of course accept that human cruelties are a necessary part of the bargain for God’s gift of freewill – which is a questionable point in itself – we are still left to account for such evils as exist beyond the limited control of our species.

Thus, to escape the problem of blaming such “acts of God” on God himself, we may choose to blame the Devil instead for all our woes, yet this leads inexorably to an insoluble dilemma. For if the Devil is a wholly distinct and self-sustaining force we have simply divided God into two opposing halves (when He must be One), whereas if we accept that this Devil is just another of the many works of the One God, then the problem never really went away in the first place. For why would any omnipotent God first create and then permit the Devil to go about in his own evil ways? It is perhaps Epicurus who puts this whole matter most succinctly:

Is God willing to prevent evil, but not able? Then he is not omnipotent. Is he able, but not willing? Then he is malevolent. Is he both able and willing? Then whence cometh evil? Is he neither able nor willing? Then why call him God? 16

It is here that we enter the thorny theological “problem of evil”, although it might equally fittingly be called the “problem of pain”, for without pain, in all its various colourations, it is hard to imagine what actual form the evil itself could take.

So confronted by the Almighty One, we might very respectfully ask, “why pain?” Or if not why pain, as such, for conceivably this God may retort that without pain we would not appreciate joy, just as we could not measure the glory of day without the darkness of night, we still might ask: but why such excessive pain, and why so arbitrarily inflicted? For what level of ecstasy can ever justify all of Nature’s cruelties?

At this point, James unceremoniously severs the Gordian knot as follows: “… the only obvious escape from paradox here is to cut loose from the monistic assumption altogether, and allow the world to have existed from its origin in pluralistic form, as an aggregate or collection of higher and lower things and principles, rather than an absolutely unitary fact. For then evil would not need to be essential; it might be, and it may always have been, an independent portion that had no rational or absolute right to live with the rest, and which we might conceivably hope to see got rid of at last…”

*

There are many who have set out to find proof of God’s existence. Some have looked for evidence in archaeology – the sunken cities of Sodom and Gomorrah, the preserved remains of Noah’s Ark, and most famously, the carbon dating of the Shroud of Turin – but again and again the trails run cold. Others turned inwards, searching for proof of God through reason. But this is surely the oldest mistake in the book. For whatever God could ever be proved by reason would undoubtedly shrivel up into a pointless kind of a God.

But there is also a comparable mistake to be made. It is repeated by all who still try, after so many attempts have failed, to absolutely refute God’s existence. For God, even the Judeo-Christian-Islamic God, can, in some more elusive sense, remain subtle enough to slip all the nets. He need not maintain the form of the big man in the sky, but can diffuse into an altogether more mysterious form of cosmic consciousness. In this more mystical form, with its emphasis on immediate apprehension, history also sinks into the background.

Dawkins and others who adhere to a strictly anti-religious view of the world are in the habit of disregarding these more subtle and tolerant religious attitudes. Fashioning arguments that whip up indignation in their largely irreligious audience, they focus on the rigid doctrines of fundamentalists. And obviously, they will never shake the pig-headed faith of such fundamentalists, but then neither will their appeals to scientific rationalism deflect many from holding more flexible and considered religious viewpoints. The reason for this is simple enough: that man (or, at least, most people) cannot live by bread alone. So, for the genuinely agnostic inquirer, strict atheism provides only an unsatisfactory existential escape hatch.

In the year 2000, the world-renowned theoretical physicist and mathematician Freeman Dyson won the Templeton Prize for Progress in Religion 17. In his acceptance speech he staked out the rightful position of religion as follows:

I am content to be one of the multitude of Christians who do not care much about the doctrine of the Trinity or the historical truth of the gospels. Both as a scientist and as a religious person, I am accustomed to living with uncertainty. Science is exciting because it is full of unsolved mysteries, and religion is exciting for the same reason. The greatest unsolved mysteries are the mysteries of our existence as conscious beings in a small corner of a vast universe. Why are we here? Does the universe have a purpose? Whence comes our knowledge of good and evil? These mysteries, and a hundred others like them, are beyond the reach of science. They lie on the other side of the border, within the jurisdiction of religion.

So the origins of science and religion are the same, he says, adding a little later:

Science and religion are two windows that people look through, trying to understand the big universe outside, trying to understand why we are here. The two windows give different views, but they look out at the same universe. Both views are one-sided; neither is complete. Both leave out essential features of the real world. And both are worthy of respect.

Trouble arises when either science or religion claims universal jurisdiction, when either religious dogma or scientific dogma claims to be infallible. Religious creationists and scientific materialists are equally dogmatic and insensitive. By their arrogance they bring both science and religion into disrepute. 18

By restoring mystery to its proper place at the centre of our lives, Dyson’s uncertainty might indeed offer the possibility for actual religious progress. It might achieve something that the purer atheism almost certainly never will. Hallelujah and amen!

*

Once upon a time I was an atheist too, only slowly coming to realise that being so sure-footed about the inessential non-spirituality of existence requires an element of faith of its own. It requires a faith in the ultimate non-mystery of the material universe. That everything is, in principle at least, fathomable. Not that this means our atheistic scientific worldview must inevitably be duller, nor that it automatically renders life less wonderful. Not at all. Life and the rest of it may appear to be just as aimless as weather, to steal James’ choice metaphor, but this has a kind of beauty of its own, as many an atheist will affirm. And there’s security of a different, some would say higher form, in the acceptance and affirmation of perfectly aimless existence. It can feel like a weight lifted.

Yet, the rarely admitted truth is that the carriers of the scientific light of reason (of whom I remain very much one) are just as uncertain as the average Joe Churchgoer about what might loosely be termed the supernatural (or supranatural) – by which I mean both the ultimately unknowable, and also, whatever strange and various events still remain unexplained by our accepted laws of the natural world. All of which stands to reason: the inexplicable lying, by its very definition, outside the province of science, whilst, at the same time, a bristling realisation that the universe is inherently and intractably mysterious stirs unconsciously at the back of all our minds, even those of the most logical and rational of thinkers. For the stark truth is that existence itself is spooky! And consequently, scientists too are sometimes afraid of the dark.

Finally then, the practising scientist, putting aside all questions of ultimate meaning or purpose, for these concerns are beyond the scope of their professional inquiries, must admit that they sideline such matters only on the grounds of expedience. The only useful scientific questions are ones that can be meticulously framed. So whilst science is necessarily dispassionate and preoccupied with material facts, it does not follow that being scientific means mistaking the scientific model for the world it approximates – any model of the universe being, at best, obviously a pale approximation to the true complexity of the original.

Scientists then are not the new high priests and priestesses of our times, because their role is cast quite differently. Gazing downwards rather than upwards, to earth rather than heaven, they pick away at the apparently lesser details in the hope of unravelling the bigger picture. Turning outwards instead of inwards, deliberately avoiding subjective interpretations in favour of tests and measurements, they seek to avoid opinion and to rise above prejudice. All of this requires a kind of modesty, or should.

But there is also a fake religion, one that dresses itself in the brilliant white of laboratory coats. It pleads that the only true way to understanding is a scientific one, disavowing all alternatives to its own rational authority. Of course such claims to absolute authority are no less fraudulent than claims of papal infallibility or the divine right of kings, but true devotees to the new religion are blind to such comparisons. More importantly, they fail to see that all claims to an exclusive understanding, whether resting on the doctrines of religion or on the microscopic scrutiny of science, aside from being false claims, necessarily involve a diminution of life itself. That at its most extreme, this new religion of scientific materialism leads unswervingly to what William Blake called “the sleep of Newton”: a mindfulness only to what can be measured and calculated. And truly this requires a tremendous sacrifice.

*

James Tunney, LLM, is an Irish Barrister who has lectured on legal matters throughout the world. He is also a poet, visual artist, and author of “The Mystical Accord: Sutras to Suit Our Times, Lines for Spiritual Evolution”. In addition, he has written two dystopian novels – “Blue Lies September”, and “Ireland I Don’t Recognize Who She Is”. Here he speaks with host of “New Thinking Allowed”, Jeffrey Mishlove about the ‘Perennial Philosophy’ tradition found in cultures throughout the world, for which the essential core tenet is mysticism. What is meant by mysticism is discussed at length, and as Tunney explains, one important characteristic shared by all mystical traditions is the primary recognition of humans (and animals) as spiritual beings. Thus, scientism as a cultural force, by virtue of its absolutist materialist dogma, is necessarily antagonistic to all forms of mysticism:

*

So by degrees I’ve been converted back to agnosticism, for all its shamefulness. Agnosticism meaning “without knowledge”. I really have no idea whether or not a god of any useful description exists, nor even whether this is a reasonable question, yet I can still confidently rule out many of his supposed manifestations (especially those where his name is top-heavy with its illuminated capital G). But any detailed speculation on the nature of god or, if you prefer, the spiritual, is what William James calls “passing to the limit”, and in passing that limit we come to what James called the “over-beliefs”.

Over-beliefs are the prime religious currency in which churches do the bulk of their business. They are what most distinguish the Lutherans from the Catholics; the Sunnis from the Shias; and more schismatically again, the Christians from the Muslims. All the carefully formulated dogma about the Holy Trinity, the Immaculate Conception, the virgin birth; the sacraments and the catechisms; and the ways of invocation of the One True God; or in more Easterly traditions, the karmic cycle and the various means and modes of reincarnation, and so on and so forth, all are over-beliefs, for they attempt to cross the threshold from “the sensible and merely understandable world” to “the hither side”. In his own conclusions, James suggested a more “pluralistic hypothesis” to square the varieties of religious experience:

Meanwhile the practical needs and experiences of religion seem to me sufficiently met by the belief that beyond each man and in a fashion continuous with him there exists a larger power which is friendly to him and to his ideals. All that the facts require is that the power should be other and larger than our conscious selves. Anything larger will do, if only it be large enough to trust for the next step. It need not be infinite, it need not be solitary. It might conceivably even be only a larger and more godlike self, of which the present self would then be but the mutilated expression, and the universe might conceivably be a collection of such selves, of different degrees of inclusiveness, with no absolute unity realized in it at all…

These are James’ over-beliefs and they broadly concur with my own. Though mine have also been tinted a little by Eastern hues. Intuitively I am drawn by the Taoist notion of the constant flux of eternal becoming. An unnameable current of creation with an effortless strength like the strength of water, which is subtle, flexible and unstoppable. Accordingly, my intuition respects the Taoist directive to flow effortlessly with this eternal current, for there is no sense in swimming against it. And this is a philosophy that complements well the mindfulness of Zen (or Ch’an), with its playful seriousness, its snapping fingers calling the wandering attention back to the here and now. I can easily empathise with the Zen student’s search for the raw nakedness of naked existence, with its requirement to strip all veils of presumed understanding; focusing upon where the outer and inner worlds reflect, to achieve a spontaneous but ineffable awakening. I can see it as a potentiality, and it does not jar against the hard-won rationality of my scientific training. In contrast to so much of the declarative wiseacring of Western philosophy, mastery of both disciplines is all about knowing when to shut up. As mythologist Joseph Campbell, author of The Hero with a Thousand Faces, once said:

God is a thought, God is an idea, but its reference is to something that transcends all thinking. I mean, he’s beyond being, beyond the category of being or nonbeing. Is he or is he not? Neither is nor is not. Every god, every mythology, every religion, is true in this sense: it is true as metaphorical of the human and cosmic mystery. He who thinks he knows doesn’t know. He who knows that he doesn’t know, knows. 19

I am not of course a Taoist nor a Buddhist of any kind. I am unaffiliated to any church. But I am drawn to Taoism and Zen Buddhism because of their appeals to objectivity, with emphasis on revelation above and beyond belief. For in neither Taoism nor Zen is any shape of God decreed or delineated: God being as much a zero as a one. And as a one-nothing, or a no-thing, this no-God requires no sacrifice, no high calls to blind obedience; for the Universe is as the Universe does. Yet something of the religious remains, beyond the purely philosophical, a something that strict atheism lacks: a personal role within the cosmic drama, which escapes the absurd chance and purposeless drifting of materialist scientism. 20

So it is that I choose to adopt them to an extent. To draw on their philosophies, and to marry these with ideas found in strands of Western Existentialism, with aspects of liberal humanism and with the better parts of Christianity (distilled in the songs of Blake, for instance). But whilst it may be edifying to pick the best from traditions of both East and West, to satisfy my god-shaped hole, I see too that such a pick-and-mix approach is prone to make as many false turns as any traditional religious route – it is interesting to note here that the word “heretic” derives from the Greek hairetikos, meaning “able to choose”. For there are no actual boundaries here. So what of the many shamanic traditions and tribal gods of primitive society? What about our own pagan heritage? Isn’t it time to get out the crystals and stuff some candles in my ears? Mesmerised by a hotchpotch of half-comprehended ideas and beliefs, just where are the safeguards preventing any freewheeling religious adventurer from falling into a woolly-headed New Ageism?

Well, it’s not for me or anyone else to call the tune. Live and let live – everyone should be entitled to march to the beat of their own drums, always taking care not to trample the toes of others in the process. But this idea of the New Age is a funny business, and I wish to save my thoughts on that (perhaps for another book). Meanwhile, my sole defence against charges of constructing a pick-and-mix religion is this: if you’d lost your keys where would you look for them? In your pocket? Down at your feet? Only under the streetlights? Oh, you have your keys – well then, good for you! Now, please don’t expect everyone else to stop looking around for theirs, or to restrict their search to the most immediate and convenient lamppost.

Having said all this, and rather shamefully spoken too much on matters that better deserve silence, it now behoves me to add that I am certainly careful when it comes to choosing between personal over-beliefs, adhering to one rule: that what is discredited by steadfast and rigorous scientific trial is guaranteed baloney. Miracles, of course, are quite out of the question, failing on account of their own self-defining impossibility. Equally I have no time for animalistic gods of any persuasion, whether or not they share a human face. But my deepest distrust is not of religions per se (since, to repeat, these are many and varied in form, and then good and bad in parts), but more specifically, for the seemingly numberless religious organs we call creeds, sects, churches and so on.

To contend that religion is always about power is to miss the bigger picture, as I hope I’ve satisfactorily shown, and yet… It would be wise for the sheep to beware the shepherd. This much agreed, however, I feel sure that religion, in some wiser form, still has an important role to play in many of our individual lives and for the sake of all our futures. You may be surprised to learn that George Orwell thought similarly, and made his opinion felt in his essay Notes on the Way (an essay which, at intervals, I shall return to later):

… Marx’s famous saying that ‘religion is the opium of the people’ is habitually wrenched out of its context and given a meaning subtly but appreciably different from the one he gave it. Marx did not say, at any rate in that place, that religion is merely a dope handed out from above; he said that it is something the people create for themselves to supply a need that he recognized to be a real one. ‘Religion is the sigh of the soul in a soulless world. Religion is the opium of the people.’ What is he saying except that man does not live by bread alone, that hatred is not enough, that a world worth living in cannot be founded on ‘realism’ and machine-guns? If he had foreseen how great his intellectual influence would be, perhaps he would have said it more often and more loudly. 21

Next chapter…

*

Addendum: mind over matter

Physicists speak about a ‘quantum theory’ but when asked what the physical reality this ‘theory’ describes is truly like, they have no useful or consistent answers at all. It works, they say, and at a mathematical level is the most precise ‘theory’ so far devised, so “shut up and calculate!” Or, if you prefer (with apologies to Shelley): look upon our quantum works and do not despair… certainly not about any gaps in our understanding about the true nature of reality that may or may not underlie it. This non-philosophical culture was the norm by the time I went to university; an opinion that was seldom if ever challenged and thus easily instilled.

Of course, quantum reality does come as a shock at first. I had genuinely felt an acute anxiety on first hearing of Schrödinger’s poor cat forever half-dead in her box. Not that we ever learnt about the famous thought experiment in class of course: no, physics abandoned Schrödinger’s cat to her interminable state of limbo long ago. Any underlying ontology was reading for pleasure only; a late-night topic for post-pub discussions.

But physics is mistaken in its beliefs. It has mixed up its modern ignorance with ultimate incomprehensibility. Schrödinger’s cat was actually meant to shock us all: most importantly, to wake up all those physicists who chose to interpret the abstraction as the world itself and decide without proof that nothing of reality exists beyond it. But we have incorporated the semi-corporeal cat into the mix of quantum oddities: as evidence of our unreal reality when the whole point was that such quantum half-death is absurd.

Moreover, what physicists today describe as ‘quantum theory’ is not strictly a theory at all but actually just a powerful predictive recipe and an engineering tool, whereas a genuine theory is yet to be written: the true quest for it is disguised by language again, because this potential future theory is what physicists currently sideline under the label ‘interpretations’ – as if they don’t much matter.

Professor of Philosophy at NYU, Tim Maudlin, explains the problem with quantum theory today and how the foundations of quantum mechanics should be understood (please ignore the perturbing observable in the background!):

Although the notion that consciousness plays a key role in quantum mechanics was seriously considered by many of the scientific luminaries of the early Twentieth Century, including John von Neumann, who discussed its salient role in his treatise The Mathematical Foundations of Quantum Mechanics, such interpretations have since fallen mostly out of favour (certainly amongst physicists). More recent empirical findings are, however, just beginning to challenge this scientific orthodoxy and may indeed rock the assertion that there is an inherent distinction between what I above called “quantum choice” and our conscious choice. In fact, in contradiction to what I originally wrote, some of the latest studies are producing results that show astonishingly high correlation between conscious intention and the so-called “collapse” of the wave function.

The last word (of this chapter – not the subject!) I shall leave to Freeman Dyson:

I cannot help but think that the awareness of our brains has something to do with the process that we call “observation” in atomic physics. That is to say, I think our consciousness is not just a passive epiphenomenon carried along by the chemical events in our brains, but is an active agent forcing the molecular complexes to make choices between one quantum state and another. In other words, mind is already inherent in every electron, and the processes of human consciousness differ only in degree but not in kind from the processes of choice between quantum states which we call ‘chance’ when they are made by electrons.

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text all newly incorporated text has been italicised.

*

1    Not quite true actually. Apparently my father was one of a small number who decided not to bother watching the first men step onto the moon’s surface. He tells me that he was so sure they would make it, he couldn’t see the point. My mother watched, and apparently so did I, although still not two years old. I can’t say that I remember anything about the moment, and probably found it a lot less interesting than Bill and Ben The Flowerpot Men, but perhaps it affected me on some deeper level — could it be that seeing the first moon landing at such a tender age was part of the reason I ended up studying comets?

2    Radiation pressure is the consequence of light itself (photons) having momentum.

3    A process that releases energy to the surroundings in the form of work as opposed to endergonic, which means energy consuming. These terms are closely related to exothermic and endothermic, where energy release and absorption take the form of heat transfer.

4    Karl Popper’s precise “line of demarcation” was that, if any theory can be shown to be falsifiable, then it can usefully be described as scientific.

5

“The totality of true propositions is the whole of natural science (or the whole corpus of the natural sciences).”

— Wittgenstein, Tractatus Logico-Philosophicus, 4.11

6

“The whole modern conception of the world is founded on the illusion that the so-called laws of nature are the explanations of natural phenomena. Thus people today stop at the laws of nature, treating them as something inviolable, just as God and Fate were treated in past ages. And in fact both were right and both wrong; though the view of the ancients is clearer insofar as they have an acknowledged terminus, while the modern system tries to make it look as if everything were explained.” — Wittgenstein, Tractatus Logico-Philosophicus, 6.371-2.

7    In German: “Wovon man nicht sprechen kann, darüber muß man schweigen.”

8    “the truth of the thoughts that are here communicated seems to me unassailable and definitive.” Taken from the preface to the Tractatus Logico-Philosophicus.

9    In this first treatise of Wittgenstein (which was the only one he ever published – his later philosophy contained in “The Philosophical Investigations” being published posthumously), he begins with the totally unsupported and deeply contentious assertion that, in effect, all meaningful language involves a description, or more correctly a depiction, of fact. This follows because the use of all language involves a correlation between objects in the world and names for those objects. This is his so-called “picture theory of language” which requires, Wittgenstein claims, a one-to-one correspondence between names and objects. This given, he demonstrates that if any proposition is to be genuine it must have a definite sense, or to put it differently, for a statement to admit to any test of proof then it must at least be possible for that question to be set out absolutely clearly. For Wittgenstein this means that questions about ethics, aesthetics and theology fall outside the realm of philosophy; the reason being that they rely on words such as “goodness”, “beauty”, “truth” and “god” which have no clear one-to-one correspondence. Wittgenstein of course later changed his mind on some of this. Recognising that his picture theory was overly simplistic he returned to philosophy with a radically new idea. That the meaning of language is contained in its social usage, thereby reassigning the work of philosophers to the study of language within its natural social environment. The purpose of philosophy was now to untie the knots of these so-called “language games”. But it is easy to mistake him here – and many do – his notion being that science can properly be understood and appraised only by those who know its language, religion likewise, and so on. And not that all inquiry is merely a matter of “playing with words”.

10  The Varieties of Religious Experience: a Study in Human Nature by William James, Longmans, Green & co, 1902; from a lecture series.

11  Ibid.

12  Ibid.

13  Ibid. Italics maintained from the original source.

14  Ibid. James earlier says, “It is absurd for science to say that the egotistic elements of experience should be suppressed. The axis of reality runs solely through the egotistic places, – they are strung upon it like so many beads.”

15  Genesis Ch.22 tells how God commanded Abraham to go to the land of Moriah and to there offer up his own son Isaac as a sacrifice. The patriarch travels three days until finally he comes to the mountain, just as God had instructed, and there he tells his servant to remain until he and Isaac have ascended the mountain. Isaac, who is given the task of carrying the wood on which he will soon be sacrificed, repeatedly asks his father why there is no animal for the burnt offering. On each occasion, Abraham says that God will provide one. Finally, as Abraham draws his knife and prepares to slaughter his son, an angel stops him. Happily, a ram has been provided and it can now be sacrificed in place of Isaac.

16  This is sometimes called “the riddle of Epicurus” or “the Epicurean Paradox” even though Epicurus did not in fact leave behind any written record of this statement. The first record of it appears some four hundred or more years after and in a work by the early Christian writer Lactantius who is actually criticising the argument.

17  Freeman Dyson is undoubtedly one of the greatest scientists never to win the Nobel Prize. However, he was awarded the Lorentz Medal in 1966 and the Max Planck Medal in 1969. In March 2000 he was also awarded the Templeton Prize, created in 1972 by the investor Sir John Templeton in an attempt to remedy what he saw as an oversight by the Nobel Prizes, which do not honour the discipline of religion. Previous Templeton Prize recipients have included the Rev. Dr. Billy Graham, Aleksandr Solzhenitsyn, Charles Colson, Ian Barbour, Paul Davies, physicist Carl Friedrich von Weizsäcker, and Mother Teresa.

18  Extracts from Freeman Dyson’s acceptance speech for the award of the Templeton Prize, delivered on May 16, 2000 at the Washington National Cathedral.

19  From an interview conducted in 1987 by American journalist Bill Moyers as a six-part series of conversations with Joseph Campbell entitled Joseph Campbell and the Power of Myth. The quote is taken from Episode 2, ‘The Message of the Myth’, broadcast on June 26, 1988. The full transcript is available here: https://billmoyers.com/content/ep-2-joseph-campbell-and-the-power-of-myth-the-message-of-the-myth/

20  It is even tempting to envisage some grand union of these two ancient Chinese philosophies, called Zow!-ism perhaps.

21  Extract taken from Notes on the Way by George Orwell, first published in Time and Tide. London, 1940.

Leave a comment

Filed under « finishing the rat race »

the stuff of dreams

The following article is Chapter Two of a book entitled Finishing The Rat Race.

All previously uploaded chapters are available (in sequence) by following the link above or from category link in the main menu, where you will also find a table of contents and a preface on why I started writing it.

*

Oats and beans and barley grow,

Oats and beans and barley grow,

Do you or I or anyone know,

How oats and beans and barley grow?

— Traditional children’s rhyme

*

One of my earliest memories at school was being told that rabbits became quicker to escape foxes, and likewise, foxes became quicker to catch rabbits. This, the teacher said, is how one type of animal can slowly change into a new type through a process known as evolution. Well, I didn’t believe that for a minute. Such dramatic outcomes from such unremarkable causes. And why, I wondered, would something change simply because it had to – having to isn’t any reason.

Of course in many ways my teacher had missed the point (though in fairness, perhaps it was I who missed his point, off in a daydream, or curiously intent on the inconstant fluttering of a leaf against the window, or otherwise lost to the innocent pleasures of childhood reveries). Either way it doesn’t matter much. Importantly, my teacher had done his job – and done it well! He had planted a seed, which made this a most valuable lesson. But in his necessarily simplified account of evolution there was a flaw (and his version would by virtue of necessity have been a simple one, because however much I may have been distracted, the subtleties of evolution were beyond the grasp of our young minds). For what he had missed out is not why the rabbits became faster but how. The question being what “adaptive mechanism” could have driven any useful sequence of changes we might call ‘evolution’. And this is really the key point. Leaving out mention of any kind of adaptive mechanism, he was leaving open all sorts of possibilities. For instance, Lamarckism and Darwinism, though both theories of evolution, paint very different accounts of how life has developed, for they presume quite different adaptive mechanisms. I will try to explain the matter more carefully and in terms of giraffes.

*

You might ask a great many questions about giraffes. For instance, how on earth their extraordinary and striking markings could ever provide useful camouflage, though if you’re ever lucky enough to see one step almost invisibly out of dappled foliage into full light, you will certainly be sure that the effect is near perfect. Alternatively, you might ask why it is that they walk with both legs on the same side moving together. A very elegant form of locomotion. However, by far and away the most frequently asked question about giraffes is this: why do they have such long necks?

Well, here’s what Lamarck would have said. Giraffes began as ordinary antelope. Some of the antelope preferred grass and others preferred leaves. The ones that preferred leaves had an advantage if they could reach higher. To achieve this they would stretch their necks a little longer. As a direct result of acquiring this new characteristic, the foals of those slightly longer-necked antelope would also be born with slightly longer necks. They too would stretch that little bit higher. Over generations some types of the antelope would develop extremely long necks and the descendants of these eventually developed into a new species called giraffes.

The basis for Lamarck’s reasoning lies in a perfectly rational misunderstanding about genetics. He assumes that the “acquired characteristics” (i.e., those characteristics developed or acquired during life) of the parents will somehow be passed through to their offspring. It turns out however that this isn’t actually the case. He might have guessed as much, I suppose. One of the oft-cited criticisms against Lamarck’s theory has been the case of Jewish boys. Why, his opponents would ask, do they ever grow foreskins in the first place?

Darwin offered an alternative hypothesis. Perhaps it goes like this, he thought: there are already differences within the population of antelope; some will have shorter necks than others to start with. Or in other words, there is already a “natural variation”. In times of plenty this may not be of significance, but in times of scarcity it could be that the antelope with longer necks have a slight advantage. This idea of course applies to any antelopes with other accidentally favourable characteristics, for example those that run faster, are better camouflaged, or have more efficient digestive systems; but let’s not go there – let’s stick to necks for a moment. The longer necked adults can reach higher and so get to those few extra leaves that will help them to survive. Having a slightly higher chance of survival means (all other factors being equal) that they are more likely to pass on their characteristics. Within a few generations there will be an inevitable increase in the population of the long-necked variety until eventually, the long-necked population might plausibly have evolved into a separate species.

What had Darwin achieved in this alternative explanation? Well, he had abolished any requirement for a heredity that depended on the transmission of “acquired characteristics.” He’d not entirely proved Lamarck wrong but only shown his ideas aren’t necessary. And although in actual fact Darwin never acknowledged Lamarck’s contribution, purely in terms of theories of heredity his own version was little better than Lamarck’s (basically, by introducing the equally flawed concept of pangenes he had finally got around the issue of Jewish foreskins). But it is not what Darwin had undermined, so much as what he had set up, that preserves his legacy. That the true driving force of evolution depends on variation and competition, in a dynamic relationship that he called “natural selection”.

According to Darwin’s new vision then, the evolution of species depends upon how individuals within that species interact with their environment. Those that are best adapted will survive longer and pass on their winning characteristics, and the rest will perish without reproducing. In short, it is “the survival of the fittest” that ensures evolutionary progress; though this catchy summary was not Darwin’s own, but one that Darwin slowly adopted. (It was actually first coined by the philosopher Herbert Spencer, whose ideas I wish to return to later.)

*

Darwin still attracts a lot of criticism and much of this criticism comes from religious quarters intent on promulgating the view that “it was God what done it all” – the Creationists who refuse to acknowledge any of the overwhelming evidence whether from zoology, botany, geology, palaeontology, or embryology; rejecting reason in deference to “the word of God”. However, there are also more considered critiques.

Perhaps the most interesting of these is that Darwin’s evolutionary theory of natural selection is unscientific because it is founded on a tautology. It is after all self-evident that the fittest will survive, given that by fitness you must really be meaning “fitness for survival”. After all, it has to be admitted that sloths have survived, and in what sense can a sloth be said to “be fit” other than in its undoubted fitness to be a sloth. The assumption then is that Darwin’s idea of natural selection has added nothing that wasn’t already glaringly obvious. Yet this is an unfair dismissal.

Firstly, it is unfair, because as I have said above, “the survival of the fittest” is Spencer’s contribution – one that leads rapidly into dangerous waters – but it is also unfair because it misses the way in which Darwin’s hypothesis is not only predictive, but also (as Karl Popper was so keenly aware) testable. If Darwin’s theory were a mere tautology then nothing on earth could ever disprove his claims, and yet there is room here for evidence that might truly test his theory to destruction.

How? Well, Darwin, it must be understood, had put forward a theory of gradual adaptation, so there is no accounting for any sudden leaps within his slowly branching history of life – so if, for instance, a complex new order of species suddenly arose in the fossil record without ancestry, then Darwin’s theory would need a radical rethink. Or let’s say some fossil was found with characteristics uncommon to any discovered ancestor. Here again Darwin’s theory would be seriously challenged. On the other hand, embryologists might discover discrepancies in the way eggs develop, and likewise, following the discovery of DNA and advent of modern genetics, we might find sudden abrupt shifts in the patterns of genes between species instead of gradual changes. Each of these cases would provide powerful evidence to challenge Darwinian theory.

But, instead of this (at least until now), these wide and varied disciplines have heaped up the supporting evidence. For example, people used to talk a lot about “the missing link”, by which they generally meant the missing link between humans and apes, whereas scientists have in fact discovered a whole host of “missing links” in the guise of close cousins from the Neanderthals to the strange and more ancient australopithecines. For more exciting missing links, how about the fact that the jaw bone of reptiles exists in four parts and that three of those bones have slowly evolved in humans to form parts of the inner ear. How do we know? Well, there is evidence in the development of mammalian and reptilian embryos and more recently the discovery of an intermediate creature in which the bones were clearly used concomitantly for both chewing and listening. This is one of many discovered creatures that Darwin’s theory has predicted – whilst the most famous is surely the bird-lizard known as Archaeopteryx. Where, by way of comparison, are the remains of, say, Noah’s Ark?

But Darwin’s theory was not correct in all details. As I have already mentioned, his notion of pangenes was in some ways little better than Lamarck’s theory of acquired characteristics, and so it is perhaps still more remarkable that whilst he looked through a wonky glass, what he gleaned was broadly correct. Although, surprisingly perhaps, it took a monk (and one trained in physics more than in biology) to begin setting the glass properly straight. Enter Gregor Mendel.

Richard Dawkins shows how whales evolved from a cloven-hoofed ancestor, and reveals whales’ closest modern-day cousin:

*

If we think back to what people knew about the world (scientifically speaking) prior to the turn of the twentieth century, it seems astonishing what was about to be discovered within just a few decades. For instance, back in 1900 physicists were still in dispute about the existence of atoms, and meanwhile, astronomers were as then unaware of the existence of independent galaxies beyond the Milky Way. But then, in 1905, Einstein suddenly published three extraordinary papers. In the least well known of these, he proved mathematically how the jiggling Brownian motion of pollen grains on water (observed by Robert Brown almost a hundred years earlier) was caused by collisions of water molecules. In doing so, he validated the concept of matter being formed out of particles, and so by extension proved the existence of atoms, finally settling a debate regarding the nature of matter that had begun more than two thousand years earlier in Greece.

Moreover, it wasn’t until the early 1920s that Edwin Hubble (now better known as the father of the idea of the expanding universe) succeeded in resolving the outer parts of other galaxies (previously called nebulae), detecting within their composition the collections of billions of individual stars. At last we knew that there were other galaxies just like our own Milky Way.

So in just twenty years, our universe had simultaneously grown and shrunk by a great many orders of magnitude. Nowadays, of course, we know that atoms are themselves composed of smaller particles: electrons, protons and neutrons, which are in turn fashioned from quarks 1; while the galaxies above and beyond congregate within further clusters (the Milky Way being one of the so-called Local Group, which is surely the most understated name for any known object in the whole of science).

The universe we have discovered is structured in multiple layers – though the boundaries between these layers are only boundaries of incomprehension. Looking upwards we encounter objects inconceivably large that are in turn the building blocks of objects much larger again, whilst investigating the finest details of the particle world, we’ve learnt how little fleas have ever smaller fleas…

Our first stabs at understanding the origins of the trillions of galaxies in our visible universe, and of comprehending the nature of the matter and energy that comprises them, have led to speculations based upon solid empirical findings that allow us to construct models of how the physical universe as a whole may have begun. Thus, via a joint collaboration between physicists searching on the macro- and micro-scales, we have finished up with the study of cosmology; the rigorous scientific study of the cosmos no less! (And to most physicists working at the turn of the twentieth century, the idea of a branch of physics solely devoted to the understanding of creation would surely have seemed like pure science fiction.) I hope my digression has helped to set the scene a little…

*

Around the turn of the twentieth century, there also remained a mystery surrounding the science of heredity and the origin of genes. It was of course common sense that children tended to have characteristics reminiscent of their parents, but in precisely what manner those parental characteristics were hybridised had remained a matter of tremendous speculation. It was still widely believed that some kind of fluid-like mingling of genes occurred, little substantial scientific progress having been made on the older ideas about bloodlines.

But those early theories of blended inheritance, which imagined the infusing together of the two gene pools, as two liquids might mix, were mistaken. If genes really behaved this way then surely the characteristics of people would also blend together. Just as we add hot water to cold to make it warm, so a white man and a black woman would surely together procreate medium brown infants, becoming darker or lighter over subsequent generations depending on whether further black or white genes were added. Which is indeed true, up to a point, but it is not strictly true. And if it really were so simple, then the range of human characteristics might (as some racial purists had feared) gradually blend to uniformity. But the real truth about inheritance, as Mendel was quietly discovering during the middle of the 19th century, is that genes have an altogether more intriguing method of combination.

*

Mendel was a monk, who aside from observing the everyday monastic duties also taught natural science, principally physics. The work that eventually made him world-renowned, however, involved studies on peas; this was Mendel’s hobby.

He spent many years cross-fertilising varieties and making detailed observations of the succeeding generations. He compared the height of plants. He compared the positioning of flowers and pods on the stem. And he noted subtle differences in shape and colour of seeds, pods and flowers. By comparing generations, Mendel found that offspring showed traits of their parents in predictable ratios. More surprisingly, he noticed that a trait lost in one generation might suddenly re-emerge in the next. So he devised a theory to explain his findings. Like a great many scientific theories, it was ingenious in its simplicity.

Within every organism, he said, genes for each inheritable trait must occur not individually, but in pairs, and in such a way that each of these “gene-pairs” is either “dominant” or “recessive” to its partner. In this way, a gene could sometimes be expressed in the individual whilst in different circumstances it might lie dormant for a generation. But please allow me a brief paragraph to explain this modern concept of inheritance more completely and coherently.

The usual way to explain Mendelian Inheritance is in terms of human eye colours. It goes like this: There is one gene for eye colour, but two gene types. These are called “alleles”, meaning “each other”. In this case, one allele produces brown eyes (let’s call this Br), and the other produces blue eyes (Bl). You inherit one of these gene types from your mother and one from your father. So let’s say you get a brown allele from each. That means you have Br-Br and will have brown eyes. Alternatively you may get a blue allele from each, and then you’ll have Bl-Bl and so have blue eyes. So far so simple. But let’s say you get a brown from one parent and a blue from the other. What happens then? Well, Mendel says, they don’t mix and produce green eyes or something; rather, one of the genes, the brown one as it happens, will be “dominant”, which means you will have brown eyes. But here’s the interesting bit, since although you have brown eyes you will nevertheless carry an allele for blue eyes – the “recessive” allele. Now let’s say you happen to meet a beautiful brown-eyed girl, who is also carrying the combined Br-Bl genes. What will your beautiful children look like? Well, all things being equal in terms of gene combination – so assuming that you are both equally likely to contribute a Bl allele as a Br allele (i.e., that this is a purely random event) – then there are only four equal possibilities: Br-Br, Br-Bl, Bl-Br, or Bl-Bl. The first three of these pairs will produce dominant brown, whilst the two recessive Bl alleles in the last pair produce blue. So if you happen to have four children, then statistically speaking, you are most likely to produce three with honey brown eyes, and one imbued with eyes like sapphires. And the milkman need never have been involved.
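The three-in-four arithmetic above is easy to check with a throwaway simulation. The sketch below simply encodes the textbook single-gene model just described (the allele labels Br and Bl are the ones used in the text, nothing more): each Br-Bl parent contributes one allele at random, and brown shows whenever at least one Br is present.

```python
import random

def child_eye_colour():
    """Two Br-Bl parents: each passes one allele at random; Br is dominant."""
    alleles = (random.choice(["Br", "Bl"]), random.choice(["Br", "Bl"]))
    return "brown" if "Br" in alleles else "blue"

trials = 100_000
brown = sum(child_eye_colour() == "brown" for _ in range(trials))
print(f"brown-eyed fraction: {brown / trials:.3f}")  # hovers around 0.75
```

Only the Bl-Bl draw – one chance in four – yields blue eyes, so the printed fraction settles near 0.75, just as the paragraph above predicts.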

Mendel had realised that instead of the old fashioned “analogue” system, in which our genes added together in some kind of satisfactory proportions – like two voices forming a new harmony – genes actually mix in an altogether more “digital” fashion, where sometimes, the gene type is on and sometimes it is off. Inevitably, the full truth is more complicated than this, with alleles for different genes sometimes combining in other ways, which will indeed lead to blending of some kinds of inherited traits. Yet even here, it is not the genes (in the form of the alleles) that are blended, but only the “expressed characteristics” of that pair of alleles – something called the phenotype. Thus, for generation after generation these gene types are merely shuffled and passed on. Indeed the genes themselves have a kind of immortality, constantly surviving, just as the bits and bytes in computer code are unaltered in reproductions. Of course, errors in their copying do eventually occur (and we now know that it is precisely such accidental “mutations” which, by adding increased variety to the gene pool, have served to accelerate the process of evolution). 2

Mendel’s inspired work was somehow lost to science for nearly half a century, and so although he was a contemporary of Darwin and knew of Darwin’s theory – indeed, Mendel owned a German translation of “On the Origin of Species”, in which he had underlined many passages – there is absolutely no reason to suppose that Darwin knew anything at all of Mendel’s ideas.

*

When Mendel’s papers were finally recovered in 1900, they helped set in motion a search for a molecular solution to the question of biological inheritance; a search that would eventually lead to Crick and Watson’s dawning realisation that the structure of DNA must take the form of an intertwined double-helix. Such an extraordinary molecule could peel apart and reform identical copies of itself. DNA, the immortal coil, the self-replicating molecule that lay behind all the reproductive processes of life, sent biologists (not least Crick and Watson) into whirls of excitement. It was 1953 and here was the biological equivalent to Rutherford’s momentous discovery of an inner structure to atoms, almost half a century earlier. Here was the founding of yet another new science. Whilst nuclear and particle physicists were finding more powerful ways to break matter apart, biologists would soon begin dissecting genes.

Aside from the direct consequences of current and future developments in biotechnology (a subject I touch on in the addendum below), the rapid developments in the field of genetics have led to another significant outcome, for biologists have also slowly been proving Darwin’s basic hypothesis. Genes really do adapt from one species to another – and we are beginning to see just precisely how. Yet in complete disregard of the mounting evidence, evolutionary theory still comes under more ferocious attack than any other established theory in science. Why does Darwinism generate such furore amongst orthodox religious groups compared say to today’s equally challenging theories of modern geology? Why aren’t creationists so eager to find fault with the field of Plate Tectonics? (Pardon the pun.) For here is a science in its comparative infancy – only formulated in the 1960s – that no less resolutely undermines the Biblical time-scale for creation, and yet it reaps no comparable pious fury. Rocks just aren’t that interesting apparently, whereas, anyone with the temerity to suggest that human beings quite literally evolved from apes… boy, did that take some courage! 3

*

Now at last, I will get to my main point, which is this: given that the question of our true origins has now been formally settled, what are we to conclude and what are the consequences to be? Or put another way, what’s the significance of discovering that just a few million years ago – a heartbeat when gauged against the estimated four billion years of the full history of life on Earth – our own ancestors branched off to form a distinct new species of ape?

Well, first and foremost, I think we ought to be clear on the fact that being such relative terrestrial latecomers gives us no grounds for special pleading. We are not in fact perched atop the highest branch of some great evolutionary tree, or put differently, all creation was not somehow waiting on our tardy arrival. After all, if evolution is blind and not goal-orientated, as Darwinism proposes, then all avenues must be equally valid, even those that were never taken. So it follows that all creatures must be evolutionarily equal. Apes, dogs, cats, ants, beetles (which Darwin during his own Christian youth had noted God’s special fondness for, if judged only by their prodigious profusion), slugs, trees, lettuces, mushrooms, and even viruses; his theory makes no preference. All life has developed in parallel, and every species that is alive today, evolved from the same evolutionary roots and over the same duration simply to reach the tips of different branches. The only hierarchy here is a hierarchy of succession – of the living over the dead.

In short then, Darwinism teaches that we are just part of the great nexus of life, and no more central or paramount than our planet is central to the universe. To claim otherwise is to be unscientific, and, as Richard Dawkins has pointed out, depends entirely upon anthropocentrism and the “conceit of hindsight”.

Darwin too quietly recognised that his theory provided no justification for any such pride in human supremacy. Likewise, he refused to draw any clear distinction between human races, correctly recognising all as a single species; an admission that says much for his intellectual courage and honesty, challenging as it did his otherwise deeply conservative beliefs. For Darwin was a Victorian Englishman, and although not a tremendously bigoted one, it must have been hard for him to accept that, amongst many other things, his own theory of evolution meant that all races of men were of equal birth.

*

But if we agree that humans are a specialised kind of ape, then we need to be fair in all respects. We have got into the habit of presuming that mankind – or Homo sapiens, “the wise man”, to apply our own vainglorious scientific denomination – is, of all the countless species on Earth, the special one. Unique because, as it used often to be claimed, we alone developed the skill to use tools. Or because we have a unique capacity for complex communication. Or because we are unparalleled creators of wonderful music and poetry. Or because we are just supremely great thinkers – analytical to the point of seeking a meaning in the existence of existence itself. Or more simply, because we are self-aware, whereas most animals seem childishly oblivious even to their own reflected images. Or, most currently fashionable, because as a species we are uniquely sophisticated in an entirely cultural sense – that is, we pass on complex patterns of behaviour to one another like no other critters.

All of our uniqueness, we owe, so it goes, to the extraordinary grey matter between our ears, with everything boiling down eventually to this: we are special because we are such brainy creatures – the cleverest around. But think about it: how can we actually be sure even in this conviction? For what solid proof have we that no other creatures on Earth can match our intellectual prowess?

Well, we might think to look immediately to brain size, but there’s a catch, as it turns out that bigger animals have bigger brain-needs merely to function. Breathing, regulating blood temperature, coping with sensory input, and so on, all require more neural processing the larger a creature becomes. So we must factor this into our equations, or else, to cite a singular example, we must concede that we are much dumber than elephants.

Okay then, let’s compare the weight of a brain against the weight of the animal it belongs to – or, more carefully, against the brain weight we would expect for an animal of that body size, since brains scale more slowly than bodies do. We might even give this measure an impressive label such as “the encephalisation quotient” or whatever. Right then, having recalibrated accordingly, we can repeat the measures and get somewhat better results this time round. Here goes: river dolphins have an EQ of 1.5; gorillas 1.76; chimpanzees 2.48; bottlenose dolphins 5.6; and humans an altogether more impressive 7.4. So proof at last that we’re streets ahead of the rest of life’s grazers. But hang on a minute, can we really trust such an arbitrary calculus? Take, for example, the case of fatter humans. Obviously they must have a lower average EQ than their thinner counterparts. So does this mean fatter people are stupider?
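For the numerically curious, the standard version of this measure (Jerison’s encephalisation quotient) can be sketched in a few lines. The allometric formula is the classic textbook one, and the species masses below are rough illustrative figures only, not precise measurements:

```python
def encephalisation_quotient(brain_g, body_g):
    """Jerison's EQ: actual brain mass divided by the brain mass
    predicted for the body size by the allometric rule
    E = 0.12 * P^(2/3), with both masses in grams."""
    expected = 0.12 * body_g ** (2.0 / 3.0)
    return brain_g / expected

# Rough, illustrative masses in grams -- not precise measurements
animals = {
    "human":      (1_350, 65_000),
    "chimpanzee": (400, 45_000),
    "elephant":   (4_700, 5_000_000),
}
for name, (brain, body) in animals.items():
    print(f"{name:11s} EQ = {encephalisation_quotient(brain, body):.1f}")
```

Note that on the raw brain-to-body ratio the elephant trails hopelessly (its brain is barely a tenth of one per cent of its body weight), whereas once the allometric correction is applied it scores a respectable EQ above 1 – which is exactly the recalibration described above.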

No, measurements of EQ might better be regarded as an altogether rougher indication of intelligence: a method to sort the sheep from the apes. But then, can you actually imagine for a minute, that if say, EQ gave higher results for dolphins than humans, we would ever have adopted it as a yardstick in the first place? Would we not have more likely concluded that there must be something else we’d overlooked besides body-mass? The fact that dolphins live in water and so don’t need to waste so much brain energy when standing still, or some such. For if we weren’t top of the class then we’d be sure to find that our method was flawed – and this becomes a problem when you’re trying to be rigorously scientific. So either we need more refinement in our tests for animal intelligence, with emphasis placed on being fully objective, or else we must concede that intelligence is too subtle a thing even to be usefully defined, let alone accurately scored.

However, a more bullish approach to our claims of greatness goes as follows: look around, do you see any other creatures that can manipulate their environment to such astonishing effects? None has developed the means to generate heat or refrigeration, to make medicines, or to adapt to survive in the most inhospitable of realms, or any of our other monumental achievements. Dolphins have no super-aqua equipment for exploring on land, let alone rockets to carry them to the Sea of Tranquility. Chimpanzees have never written sonnets or symphonies – and never will no matter how infinite the availability of typewriters. So the final proof of our superiority then is this, whether we call it intelligence or give it any other endorsement: technological achievement, artistic awareness, and imagination of every kind.

But what then of our very early ancestors, those living even before the rise of Cro-Magnon 4 and that first great renaissance which happened more than 40,000 years ago? Cro-Magnon people made tools, wore clothes, lived in huts, and painted the wonderful murals at Lascaux in France and at Altamira in Spain. They did things that are strikingly similar to the kinds of things that humans still do today. Homo sapiens of earlier times than these, however, left behind no comparable human artifacts, and yet, physiologically speaking, were little different from you or me. Given their seeming lack of cultural development then, do we have justification for believing them intellectually inferior, or could it be that they simply exercised their wondrous imaginations in more ephemeral ways?

Or let’s take whales, as another example. Whales, once feared and loathed as little more than gigantic fish, are nowadays given a special privilege. Promoted to the ranks of the highly intelligent (after humans obviously), we have mostly stopped brutalising them. Some of us have gone further again, not merely recognising them as emotionally aware and uncommonly sensitive creatures, but ‘communing with them’. Swimming with dolphins is nowadays rated as one of the must-have life experiences along with white-water rafting and bungee jumping. So somehow, and in spite of the fact that whales have never mastered the ability to control or manipulate anything much – tool-use being a tricky business, of course, if you’re stuck with flippers – nevertheless, whales have joined an elite class: the “almost human”. We have managed to see beyond their unbridgeable lack of dexterity, because whales satisfy a great many of our other supposedly defining human abilities – ones that I outlined above.

Dolphins, we learn, can recognise their own reflections. And they use sounds, equivalent to names, as a way to distinguish one another – so do they gossip? How very anthropomorphic of me to ask! Also, and in common with many other species of cetaceans, they sing, or at least communicate by means of something we hear as song. Indeed, quite recent research based on information theory has been revealing; mathematical analysis of the song of the humpback whale indicates that it may be astonishingly rich in informational content – so presumably then they do gossip! And not only that, but humpback whales (and others of the larger whale species) share a special kind of neural cell with humans, called spindle cells. So might we gradually discover that humpback whales are equally as smart as humans? Oh come, come – let’s not get too carried away!
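The information-theoretic idea behind those whale-song studies can be illustrated with a toy example. Shannon entropy measures how unpredictable a sequence of symbols is, and hence puts an upper bound on how much it could be saying; the real analyses apply far subtler versions of this to sequences of song units, but the principle is the same:

```python
from collections import Counter
from math import log2

def entropy(sequence):
    """Shannon entropy: average bits per symbol of a sequence."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(entropy("aaaaaaaa"))   # 0.0 bits: totally predictable
print(entropy("abababab"))   # 1.0 bit per symbol
print(entropy("abcdabcd"))   # 2.0 bits: a richer repertoire
```

A monotonous drone scores zero; the larger and more evenly used the repertoire of song units, the higher the score – which is roughly what is meant by a song being “rich in informational content”.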

*

Do you remember a story about the little boy who fell into a zoo enclosure, whereupon he was rescued and nursed by one of the gorillas? It was all filmed, and not once but twice in fact – on different occasions and involving different gorillas, Jambo 5 and Binti Jua. 6 After these events, some in the scientific community sought to discount the evidence of their own eyes (even though others who’d worked closely with great apes saw nothing which surprised them at all). The gorillas in question, these experts asserted, evidently mistook the human child for a baby gorilla. Stupidity rather than empathy explained the whole thing. 7

Scientists are rightly cautious, of course, when attributing human motives and feelings to explain animal behaviour; however, a strict denial of parallels which precludes all recognition of motives and feelings aside from those of humans becomes a reductio ad absurdum. Such an overemphasis on the avoidance of anthropomorphism is no measure of objectivity and leads us just as assuredly to willful blindness as naïve sentimentality can. Indeed, to arrogantly presume that our closest evolutionary relatives, with whom we share the vast bulk of our DNA, are so utterly different that we must deny the most straightforward evidence of complex feelings and emotions reflects very badly upon us.

But then why stop with the apes? Dolphins are notoriously good at rescuing stranded swimmers, and if it wasn’t so terribly anthropomorphising I’d be tempted to say that they sometimes seem to go out of their way to help. Could it be that they find us intriguing, or perhaps laughable, or even pathetic (possibly in both senses)? – Adrift in the sea and barely able to flap around. “Why do humans decide to strand themselves?” they may legitimately wonder.

Dogs too display all the signs of liking us, or fearing us, and, at other times, of experiencing pleasure and pain, so here again what justification do those same scientists have to assume their expressions are mere simulacra? And do the birds really sing solely to attract potential mates and to guard their territory? Is the ecstatic trilling of the lark nothing more than a pre-programmed reflex? Here is what the eminent Dutch psychologist, primatologist and ethologist, Frans B.M. de Waal, has to say:

“I’ve argued that many of what philosophers call moral sentiments can be seen in other species. In chimpanzees and other animals, you see examples of sympathy, empathy, reciprocity, a willingness to follow social rules. Dogs are a good example of a species that have and obey social rules; that’s why we like them so much, even though they’re large carnivores.” 8

Here’s an entertaining YouTube clip showing how goats too sometimes like to have a good time:

Rather than investigating the ample evidence of animal emotions, for too long the scientific view has been focused on the other end of the telescope. So we’ve had the behaviourists figuring that if dogs can be conditioned to salivate to the sound of bells then maybe children can be similarly trained, even to the extent of learning such unnecessary facts and skills (at least from a survival point of view) as history and algebra. Whilst more recently, with the behaviourists having exited the main stage (bells ringing loudly behind), a new wave of evolutionary psychologists has entered, and research is ongoing; a search for genetic propensities for all traits from homosexuality and obesity, to anger and delinquency. Yes, genes for even the most evidently social problems, such as criminality, are being earnestly sought after, so desperate is the need of some to prove we too are nothing more than complex reflex machines; dumb robots governed by our gene-creators, much as Davros operates the controls of the Daleks. In these ways we have demoted our own species to the same base level as the supposed automata beasts.

Moreover, simply to regard every non-human animal as a being without sentience is scientifically unfounded. If anything it is indeed based on a ‘religious’ prejudice; one derived either directly from orthodox faith, or as a distorted refraction via our modern faith in humanism. But it is also a prejudice that leads inexorably into a philosophical pickle, inspiring us to draw equally dopey mechanical caricatures of ourselves.

*

So what is Darwin’s final legacy? Well, that of course remains unclear, and though it is established that his conjectured mechanism for the development and diversity of species is broadly correct, this is no reason to believe that the whole debate is completely done and dusted. And since Darwin’s theory of evolution has an in-built bearing on our relationship to the natural world, and by extension, to ourselves, we would be wise to recognise its limitations.

Darwinism offers satisfactory explanations to a great many questions. How animals became camouflaged. Why they took to mimicry. What causes peacocks to grow such fabulous tails – or at least why their fabulous tails grow so prodigiously large. It also helps us to understand a certain amount of animal behaviour. Why male fish look after the young more often than males of other vertebrate classes do. Why cuckoos lay their eggs in the nests of other birds. And why the creatures that produce the largest broods are most often the worst parents.

Darwinism also gives a good account of a wide range of complex and sophisticated human emotions. It copes admirably with nearly all of the seven deadly sins. Gluttony, wrath, avarice and lust present no problems at all. Sloth is a little trickier, though once we understand the benefits of conserving energy, it soon fits into place, whilst envy presumably encourages us to strive harder. Pride is perhaps the hardest to fathom, since it involves an object of affection that hardly needs inventing, at least from a Darwinian perspective. But I wish to leave aside questions of selfhood for later.

So much for the vices then, but what of the virtues? How, for example, are Darwinians able to account for the rise of more altruistic behaviour? For Darwinian purists, altruism arrives as a bit of a hot potato. Not that altruism is a problem in and of itself, for this is most assuredly not the case. Acts of altruism between related individuals are to be expected. Mothers that did not carry genes to make them devoted toward their own children would be less likely to successfully pass on their genes. The same may be said for natural fathers, and this approach can be intelligently elaborated and extended to include altruism within larger, and less gene-related, groups. It is a clever idea, one that can be usefully applied to understanding the organisation of various communities, including those of social insects such as bees, ants, termites and, of course, naked mole rats…! Yes, as strange as it may sound, one special species of subterranean rodent, the naked mole rat, has social structures closely related to those of the social insects, and the Darwinian approach explains this too, as Dawkins brilliantly elucidates in a chapter of his book The Selfish Gene. Yet there remains one puzzle that refuses such insightful treatment.

When I was seventeen I went off cycling with a friend. On the first day of our adventures into the wilderness that is North Wales, we hit a snag. Well, actually I hit a kerb, coming off my bike along a fast stretch of the A5 that drops steeply down into Betws-y-Coed – a route that my parents had expressly cautioned me not to take, but then as you know, boys will be boys. Anyway, as I came to a long sliding halt along the pavement (and not the road itself, as luck would have it), I noticed that a car on the opposite side had pulled up. Soon afterwards, I was being tended to by a very kindly lady. Improvising first aid using tissues from a convenient packet of wet-wipes, she gently stroked as much of the gravel from my wounds as she could. She calmed me, and she got me back on my feet, and without all her generous support we may not have got much further on our travels. I remain very grateful to this lady, a person who I am very unlikely to meet ever again. She helped me very directly, and she also helped me in another way, by teaching me one of those lessons of life that stick. For there are occasions when we all rely on the kindness of strangers, kindness that is, more often than not, as freely given as it is warmly received. Yet even such small acts of kindness pose a serious problem for Darwinian theory, at least, if it is to successfully explain all forms of animal and human behaviour. The question is simply this: when there is no reward for helping, why should anyone bother to stop?

Dawkins devotes an entire chapter of The Selfish Gene to precisely this subject. Taking an idea from “game theory” called “the prisoner’s dilemma”, he sets out to demonstrate that certain strategies of life that aim toward niceness are actually more likely to succeed than other more cunning and self-interested alternatives. His aim is to prove that, contrary to much popular opinion, “nice guys finish first”. But here is a computer game (and a relatively simple one at that), whereas life, as Dawkins knows full well, is neither simple nor a game. In consequence, Dawkins then grasps hold of another twig, pointing out how humans are a special case – as if we needed telling…
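For readers who want to see the logic of that chapter played out, the kind of tournament Dawkins describes is easily sketched. Everything below – the payoff values and the four strategies – is the standard textbook setup after Axelrod, not drawn from The Selfish Gene itself:

```python
# Iterated prisoner's dilemma round-robin (after Axelrod).
# Payoffs per round: mutual cooperation 3/3, mutual defection 1/1,
# lone defector scores 5 while the lone cooperator scores 0.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opp):        # "nice": cooperate first, then mirror
    return opp[-1] if opp else "C"

def grudger(opp):            # nice until crossed, then defects forever
    return "D" if "D" in opp else "C"

def always_defect(opp):      # the pure cheat
    return "D"

def always_cooperate(opp):   # the pure sucker
    return "C"

def play(strat_a, strat_b, rounds=200):
    """Return strategy A's total score over repeated rounds against B."""
    hist_a, hist_b = [], []  # each strategy sees only the other's history
    score_a = 0
    for _ in range(rounds):
        a, b = strat_a(hist_b), strat_b(hist_a)
        score_a += PAYOFF[(a, b)][0]
        hist_a.append(a)
        hist_b.append(b)
    return score_a

strategies = {"tit-for-tat": tit_for_tat, "grudger": grudger,
              "always-defect": always_defect,
              "always-cooperate": always_cooperate}

# Every strategy plays every strategy (itself included) for 200 rounds
totals = {name: sum(play(sa, sb) for sb in strategies.values())
          for name, sa in strategies.items()}
for name in sorted(totals, key=totals.get, reverse=True):
    print(name, totals[name])
```

Run it and the “nice” strategies come out on top, with always-defect trailing in last place – the “nice guys finish first” result. Which is, of course, exactly the point made above: the result holds inside this tidy little game, and life is neither simple nor a game.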

As a species, he says, we have the unique advantage of being able to disrespect the programming of our own selfish genes. For supporting evidence he cites the use of contraception, which is certainly not the sort of thing that genes would approve of. But then why are we apparently unique in having this ability to break free of our instinctual drives? Dawkins doesn’t say. There is no explanation other than that same old recourse to just how extraordinarily clever we are – yes, we know, we know! Yet the underlying intimation is really quite staggering: that human beings have evolved to be so very, very, very clever, that we have finally surpassed even ourselves.

As for such disinterested acts of altruism, the kind of instance exemplified by the Samaritanism of my accidental friend, these, according to strict Darwinians such as Dawkins, must be accidents of design. A happy by-product of evolution. A spillover. For this is the only explanation that evolutionary theory in its current form could ever permit.

Embedded below is one of a series of lectures given by distinguished geneticist and evolutionary biologist Richard Lewontin in 1990. The minutely detailed case he makes against the excesses of a Darwinian approach to human behaviour, as well as the latent ideology of socio-biology, is both lucid and persuasive:

*

Allow me now to drop a scientific clanger. My intention is to broaden the discussion and tackle issues about what Darwinism has to say about being human, and no less importantly, about being animal or plant. To this end then, I now wish to re-evaluate the superficially religious notion of “souls”; for more or less everything I wish to say follows from consideration of this apparently archaic concept.

So let me begin by making the seemingly preposterous and overtly contentious statement that just as Darwin’s theory in no way counters a belief in the existence of God, or gods as such, likewise it does not entirely discredit the idea of souls. Instead, Darwin has eliminated the apparent need for belief in the existence of either souls or gods. But this is by no means the same as proving they do not exist.

Now, by taking a more Deistic view of Creation (as Darwin more or less maintained until late in his own life), one may accept the point about some kind of godly presence, for there is certainly room for God as an original creative force, and of some ultimately inscrutable kind, and yet it may still be contended that the idea of souls has altogether perished. For evolutionary theory establishes beyond all reasonable doubt that we are fundamentally no different from the other animals, or in essence from plants and bacteria. So isn’t it a bit rich then, clinging to an idea like human souls? Well, yes, if you put it that way, though we may choose to approach the same question differently.

My contention is that ordinary human relations already involve the notion of souls; it is only that we generally choose not to use the word soul in these contexts, presuming it to be outmoded and redundant. But perhaps given the religious weight of the word this will seem a scandalous contention, so allow me to elucidate. Everyday engagement between human beings (and no doubt other sentient animals), especially if one is suffering or in pain, automatically involves the feeling of empathy. So what then is the underlying cause of our feelings of empathy? – Only the most hard-nosed of behaviourists would dismiss it as a merely pre-programmed knee-jerk response.

Well, empathy, almost by definition, must mean that, in the other, we recognise a reflection of something found within ourselves. But then, what is it that we are seeing reflected? Do we have any name for it? And is not soul just as valid a word as any other? Or, to consider a more negative context, if someone commits an atrocity against others, then we are likely to regard this person as wicked. We might very probably wish to see this person punished. But how can anyone be wicked unless they had freedom to choose otherwise? So then, what part of this person was actually free? Was it the chemical interactions in their brain, or the electrical impulses between the neurons, or was it something altogether less tangible? And whatever the cause, we cannot punish the mass of molecular interactions that comprises their material being, because punishment involves suffering and molecules are not equipped to suffer. So ultimately we can only punish “the person within the body”, and what is “the person within the body” if not their soul?

But why is it, you may be wondering, that I want to rescue the idea of souls at all? For assuredly you may argue – and not without sound reason – that you have no want nor need for any woolly notions such as soul or spirit to encourage you to become an empathetic and loving person. You might even add that many of the cruellest people in history believed in the existence of the human soul. And I cannot counter you on either charge.

But let’s suppose that finally we have banished all notions of soul or spirit completely and forever – what have we actually achieved? And how do we give a fair account of that other quite extraordinary thing which is ordinary sentience? For quite aside from the subtle complexity of our moods and our feelings of beauty, of sympathy, of love, we must first account for our senses. Those most primary sensory impressions that form the world we experience – the redness of red objects, the warmth of fire, the saltiness of tears – the inexpressible, immediate, and ever-present streaming experience of conscious awareness that philosophers have called qualia. If there are no souls then what is actually doing the experiencing? And we should remember that here “the mind” is really nothing more or less, given our current ignorance, than a quasi-scientific synonym for soul. It is another name for the unnailable spook.

Might we have developed no less successfully as dumb automata? There is nothing in Darwin or the rest of science that calls on any requirement for self-conscious awareness to ensure our survival and reproduction. Nothing to prevent us negotiating our environment purely with sensors connected to limbs, via programmed instructions vastly more complex yet inherently no different from the ones that control this word processor, and optimised as super-machines that have no use for hesitant, stumbling, bumblingly incompetent consciousness. So what use are qualia in any case?

In purely evolutionary terms, I don’t need to experience the sensation of red to deal with red objects, any more than I need to see air in order to breathe. Given complex enough programs and a few cameras, future robots can (and presumably will) negotiate the world without need of actual sensations, let alone emotions. And how indeed could the blind mechanisms of dumb molecules have accidentally arranged into such elaborate forms to enable cognitive awareness at all? Darwin does not answer these questions – they fall beyond his remit. But then no one can answer these questions (and those who claim reasons to dismiss qualia on philosophical grounds, can in truth only dismiss the inevitably vague descriptions, rather than the ever-present phenomenon itself – or have they never experienced warmth, touched roughness nor seen red?).

And so the most ardent of today’s materialists wish to go further again. They want to rid the world of all speculation regarding the nature of mind. They say it isn’t a thing at all, but a process of the brain, which is conceivably true. (Although I’d add: why stop at the brain?)

One fashionable idea goes that really we are “minding”, which is interesting enough given our accustomed error of construing the world in terms of objects rather than actions; nouns coming easier than verbs to most of us. But then, whether the mind might be best represented by a noun or a verb seems for now, and given that we still know next to nothing in any neurological sense, to be purely a matter of taste.

The modern reductionism that reduces mind to brain, often throws up an additional claim. Such material processes, it claims, will one day be reproduced artificially in the form of some kind of highly advanced computer brain. Well, perhaps this will indeed happen, and perhaps one day we really will have “computers” that actually experience the world, rather than the sorts of machines today that simply respond to sensors in increasingly complex ways. I am speculating about machines with qualia: true artificial brains that are in essence just as aware as we are. But then how will we know?

Well, that’s a surprisingly tricky question and it’s one that certainly isn’t solved by the famous Turing Test, named after the father of modern computing, Alan Turing. For the Turing Test is merely a test of mimicry, claiming that if one day a computer is so cunningly programmed as to be indistinguishable from a human intelligence then it is also equivalent to one. But that of course is nonsense. It is nonsense that reminds me of a very cunning mechanical duck someone once made: one that could walk like a duck, quack like a duck, and if rumours are to be believed, even crap like a duck. A duck, however, it was not, and nor could it ever become one no matter how elaborate its clockwork innards. And as with ducks so with minds.

But let’s say we really will produce an artificial mind, and somehow we can be quite certain that we really have invented just such an incredible, epoch-changing machine. Does this mean that in the process of conceiving and manufacturing our newly conscious device, we must inevitably learn what sentience is of itself? This is not a ridiculous question. Think about it: do you need to understand the nature of light in order to manufacture a light bulb? No. The actual invention of light bulbs precedes the modern physical understanding. And do we yet have a full understanding of what light truly is, and is such a full understanding finally possible at all?

Yet there are a few scientists earnestly grappling with questions of precisely this kind, venturing dangerously near the forests and swamps of metaphysics, in search of answers that will require far better knowledge and understanding of principles of the mind. Maybe they’ll even uncover something like “the seat of the soul”, figuring out from whence consciousness springs. Though I trust that you will not misunderstand me here, for it is not that I advocate some new kind of reductionist search for the soul within, by means of dissection or the application of psychical centrifuges using high strength magnetic fields or some such. As late as the turn of the twentieth century, there was indeed a man called Dr. Duncan MacDougall, who had embarked on just such a scheme: weighing people at the point of death, in experiments to determine the mass of the human soul. 9 A futile search, of course, for soul – or mind – is unlikely to be, at least in the usual sense, a substantial thing. And though contingent upon life, we have no established evidence for its survival into death.

My own feeling is that the soul is no less mortal than our brains and nervous systems, on which it seemingly depends. But whatsoever it turns out to be, it is quite likely to remain immeasurable – especially if we choose such rudimentary apparatus as a set of weighing scales for testing it. The truth is that we know nothing as yet, for the science of souls (or minds if you prefer) is still without its first principle. So the jury is out on whether or not science will ever explain what makes a human being a being at all, or whether it is another one of those features of existence that all philosophy is better served to “pass over in silence”.

Here is what respected cognitive scientist Steven Pinker has to say of sentience in his entertainingly presented and detailed overview of our present understanding of How the Mind Works:

“But saying that we have no scientific explanation of sentience is not the same as saying that sentience does not exist at all. I am as certain that I am sentient as I am certain of anything, and I bet you feel the same. Though I concede that my curiosity about sentience may never be satisfied, I refuse to believe that I am just confused when I think I am sentient at all! … And we cannot banish sentience from our discourse or reduce it to information access, because moral reasoning depends on it. The concept of sentience underlies our certainty that torture is wrong and that disabling a robot is the destruction of property but disabling a person is murder.” 10

*

There is a belief common to a camp of less fastidious professional scientists than Pinker which, for the sake of simplicity, holds that consciousness, if it was ever attached at all, was supplied by Nature as a sort of optional add-on, and that every human experience is fully reducible to an interconnected array of sensory mechanisms and data-processing systems. Adherents to this view tend not to think too much about sentience, of course, and in rejecting their own central human experience they thereby commit a curiously deliberate act of self-mutilation – one that leaves only zombies fit for ever more elaborate Skinner boxes 11 – even when, beyond their often clever rationalisations, we all share a profound realisation that there is far more to life than mere stimulus and response.

Orwell, wily as ever, was alert to such dangers in modern thinking, and reworking a personal anecdote into grim metaphor, he neatly presented our condition:

“… I thought of a rather cruel trick I once played on a wasp. He was sucking jam on my plate, and I cut him in half. He paid no attention, merely went on with his meal, while a tiny stream of jam trickled out of his severed œsophagus. Only when he tried to fly away did he grasp the dreadful thing that had happened to him. It is the same with modern man. The thing that has been cut away is his soul, and there was a period — twenty years, perhaps — during which he did not notice it.”

Whilst Orwell regards this loss as deeply regrettable, he also recognises that it was a very necessary evil. Given the circumstances – nineteenth-century religious belief being, as he put it, “…in essence a lie, a semi-conscious device for keeping the rich rich and the poor poor…” – he is nevertheless dismayed at how hastily we’ve thrown out the baby with the holy bathwater. Thus he continues:

“Consequently there was a long period during which nearly every thinking man was in some sense a rebel, and usually a quite irresponsible rebel. Literature was largely the literature of revolt or of disintegration. Gibbon, Voltaire, Rousseau, Shelley, Byron, Dickens, Stendhal, Samuel Butler, Ibsen, Zola, Flaubert, Shaw, Joyce — in one way or another they are all of them destroyers, wreckers, saboteurs. For two hundred years we had sawed and sawed and sawed at the branch we were sitting on. And in the end, much more suddenly than anyone had foreseen, our efforts were rewarded, and down we came. But unfortunately there had been a little mistake. The thing at the bottom was not a bed of roses after all, it was a cesspool full of barbed wire.” 12

On what purely materialistic grounds can we construct any system of agreed morality? Do we settle for hedonism, living our lives in the unswerving pursuit of personal pleasure; or else insist upon the rather more palatable, though hardly more edifying, alternative of eudaemonism, with its eternal pursuit of individual happiness? Our desires for pleasure and happiness are evolutionarily in-built, and it is probably fair to judge that most of us, if not all, find great need of both to proceed through life with any healthy kind of disposition. Pleasure and happiness are wonderful gifts, to be cherished when fortune blows them to our shore. Yet pleasure is more often short-lived, whilst happiness too is hard to maintain. So they hardly stand as rocks, providing little in the way of stability if we are to build solidly upon their foundations. Moreover, they are not, as we are accustomed to imagine, objects to be sought after at all. If we chase either one then it is perfectly likely that it will recede ever further from our reach. So it is better, I believe, to look upon these true gifts as we find them, or rather, as they find us: evanescent and only ever now. Our preferred expressions of the unfolding moment of life. To measure our existence solely against them is, however, to miss the far bigger picture of life, the universe and everything. 13

We might decide, of course, to raise the social above these more individualistic pursuits: settling on the Utilitarian calculus of increased happiness (or else reduced unhappiness) for the greatest number. But this is a rough calculus, and one that, however subtly conceived, never finally escapes from its own deep moral morass. For Utilitarianism, though seeking to secure the greatest collective good, is, by construction, blind to all evils as such, being concerned always and only with determining better or worse outcomes. The worst habit of Utilitarianism is to privilege ends above means. Lacking moral principle, it grants licence for “necessary evils” of every prescription: all wrongs being weighed (somehow) against perceived benefits.

We have swallowed a great deal of this kind of poison, so much so that we feel uncomfortable in these secular times speaking of “acts of evil” or of “wickedness”, as if these archaic terms might soon be properly expurgated from our language. Yet still we feel the prick of our own conscience: a hard-wired sense of what is most abhorrent, combined with an innate notion of justice that once caused the child to complain “but it isn’t fair… it isn’t fair!” Meanwhile, the “sickness” in the minds of others makes us feel sick in turn.

On what grounds can the staunchest advocates of materialism finally challenge those who might turn and say: this baby with Down’s Syndrome, this infant with polio, this old woman with Parkinson’s Disease, this schizophrenic, these otherwise healthy but unwanted babies or young children, haven’t they already suffered enough? And if they justify a little cruelty now in order to stave off greater sufferings to come, or, more savagely still, claim that the greater good is served by the painless elimination of a less deserving few, what form should our prosecution take? By adopting a purely materialistic outlook then, we are collectively drawn, whether we wish it or not, toward the pit of nihilism. Even the existentialists, setting off determined to find meaning in the here and now, sooner or later recognised the need for some kind of transcendence, or else abandoned all hope.

*

Kurt Vonnegut was undoubtedly one of the most idiosyncratic of twentieth century writers. 14 During his lifetime, Vonnegut was often pigeonholed as a science fiction writer, no doubt because his settings are very frequently in some way futuristic; yet as science fiction goes, his stories are generally rather earth-bound. In general, Vonnegut seems more preoccupied with the unlikely interactions between his variety of freakish characters (many of whom reappear in different novels) than with using his stories as a vehicle to project a vision of the future itself. Deliberately straightforward, his writing is ungarnished and propelled by sharp, snappy sentences. He hated semi-colons, calling them grammatical hermaphrodites.

Vonnegut often used his talented imagination to tackle the gravest of subjects, clowning around with dangerous ideas, and employing the literary equivalent of slapstick comedy to puncture human vanity and to make fun of our grossest stupidities. He liked to sign off chapters with a hand-drawn asterisk, because he said it represented his own arsehole. As a satirist then, he treads a path that was pioneered by Swift and Voltaire; of saying the unsayable but disguising his contempt under the cover of phantasy. He has become a favourite author of mine.

In 1996, he was awarded the title of American Humanist of the Year. In his acceptance speech, he took the opportunity to connect together ideas that had contributed to his own understanding of what it meant to be a humanist; ideas that ranged over a characteristically shifting and diverse terrain. Here were his concluding remarks:

“When I was a little boy in Indianapolis, I used to be thankful that there were no longer torture chambers with iron maidens and racks and thumbscrews and Spanish boots and so on. But there may be more of them now than ever – not in this country but elsewhere, often in countries we call our friends. Ask the Human Rights Watch. Ask Amnesty International if this isn’t so. Don’t ask the U.S. State Department.

And the horrors of those torture chambers – their powers of persuasion – have been upgraded, like those of warfare, by applied science, by the domestication of electricity and the detailed understanding of the human nervous system, and so on. Napalm, incidentally, is a gift to civilization from the chemistry department of Harvard University.

So science is yet another human-made God to which I, unless in a satirical mood, an ironical mood, a lampooning mood, need not genuflect.” 15

*

René Descartes is now most famous for having declared “cogito ergo sum”, which means of course “I think therefore I am”. It was a necessary first step, or so he felt, to escape from the paradox of absolute skepticism, which was the place he had chosen to set out from at the beginning of his metaphysical meditations. What Descartes was basically saying was this: look here, I’ve been wondering whether I exist or not, but now, having caught myself in the act, I can be sure that I do – for even if I must remain unsure of everything else besides, I cannot doubt that I am doubting. It is important to realise here that Descartes’ proposition says more than perhaps first meets the eye. After all, he intends it as a stand-alone proof, and thus to be logically self-consistent; the key to understanding how lies in his use of the word “therefore”, which automatically implies his original act of thinking. If challenged, then, to say how he can be certain even that he is thinking, Descartes’ defence relies upon the very act of thinking (or doubting, as he later put it 16) described in the proposition. Thinking is undeniable, Descartes is saying, and my being depends on this. Yet this first step is already in error, and importantly, the consequences of this error resonate still throughout modern western thought.

René Descartes, a Christian brought up to believe that animals have no souls (as Christians are wont to do), readily persuaded himself that they therefore felt no pain. It was a belief that permitted him routinely to perform horrific experiments in vivisection (he was a pioneer in the field). I mention this because strangely, and in spite of Darwin’s solid refutation of man’s pre-eminence over beasts, animal suffering is still regarded as entirely different in kind from human suffering, even in our post-Christian society. And I am sorry to say that scientists are hugely to blame for this double standard. Barbaric experimentation, most notoriously in the field of psychology, alongside unnecessary tests for new products and new weapons, is still performed on every species aside from ours; whilst in more terrible (and shamefully recent) times, when scientists were afforded licence to redraw the line above the species level, their subsequent demarcations made on grounds of fitness and race, the same cool-headed objectivity was applied to the handicapped, to prisoners of war, and to the Jews. It is better that we never forget how heinous atrocities have too often been committed in the name and pursuit of coldly rational science.

René Descartes still has a role to play in this. For by prioritising reason in order to persuade himself of his own existence, he encouraged us to follow him into error: to mix up our thinking with our being; to presume that existence is somehow predicated on reasoning, and not, at least not directly, on the fact that we feel, or that we sense, or most fundamentally, that we are. If it is rationality that sets us apart from the beasts, then, the implication runs, we exist in a fuller sense than the beasts ever can.

To be absolutely certain of the reality of a world beyond his mind, however, Descartes needed the help of God – of a living God of Truth and Love. For were it not for the certainty of God’s existence, Descartes argued, his mind – though irrefutably extant – might yet be prey to the illusions of some kind of a “deceitful daemon”; nothing more than a brain in a vat, to give his idea a modern slant, plugged into what today would most probably be called The Matrix.

Thus realising that everything he sensed and felt might conceivably be an elaborately constructed illusion, only Descartes’ profound conviction of a God of Truth – a God who made the world as true and honest as it appeared to be – could save his philosophy from descent into pure solipsism. But this primary dualism of mind and world is itself a division of mind and body – a division of self – while to regard Reason as the primary and most perfect attribute of being obviously establishes the mind above the body, and, more generally, spirit above matter. This is the lasting lesson Descartes taught, and we have committed it so deeply to our Western consciousness that we have forgotten we ever learnt it in the first place.

The significant difference in today’s world of science, with God now entirely outside of the picture, is that Descartes’ hierarchy has been totally up-ended. Matter is the new boss, and mind, its servant. 17

*

But we might also turn this whole issue on its head. We might admit the obvious. Concede that although we don’t know what it is exactly, there is some decidedly strange and immaterial part to ourselves. That it is indeed the part we most identify with – the part we refer to so lovingly as “I”. And that it is this oh-so mysterious part of us which provides all our prima facie evidence for existence itself. Though in admitting this, the question simply alters. It becomes: how to account for the presence of such a ghost inside our machines? For what outlandish contrivance would we need to reconnect the matter of our brains with any such apparently in-dwelling spirit? And whereas René Descartes once proposed that mind and body might be conjoined within the mysterious apparatus of our pineal gland (presumably on the grounds that the pineal gland is an oddly singular organ), we know better and so must look for less localised solutions. In short then, we may finally need to make a re-evaluation of ourselves, not merely as creatures, but as manifestations of matter itself.

Yet, in truth, all of this is really a Judeo-Christian problem; a deep bisection where other traditions never made any first incision. For what is “matter” in any case? Saying it’s all atoms and energy doesn’t give a final and complete understanding. Perhaps our original error was to force such an irreconcilable divorce between nebulous soul (or mind) and hard matter, when they are so indivisibly and gloriously codependent; for though Science draws a marked distinction between the disciplines of physics and psychology, the division stands only for the sake of convenience – for the sake, indeed, of ignorance.

To begin then, let’s try to re-establish some sense of mystery regarding the nature of matter itself – such everyday stuff that we have long taken for granted, trusting that through careful measurement and mathematical projection its behaviour can be understood and predicted. Here indeed, Freeman Dyson brings his own expertise in quantum theory, combined with his genius for speculation, to the fascinating subject of mind and its relationship to matter:

“Atoms in the laboratory are weird stuff, behaving like active agents rather than inert substances. They make unpredictable choices between alternative possibilities according to the laws of quantum mechanics. It appears that mind, as manifested by the capacity to make choices, is to some extent inherent in every atom. The universe as a whole is also weird, with laws of nature that make it hospitable to the growth of mind.”

Dyson is drawing upon his very deep understanding of quantum physics, and yet already he has really said too much. Quantum choice is not the same as human choice. Quantum choice depends on random chance, which is the reason Einstein famously asserted, “God does not play dice”. Indeed I’m not sure how quantum theory, as it is currently understood, could ever account for the existence of free will and volition, quite aside from the overriding mystery of sentience itself. So Dyson’s more important point is perhaps his last one: that the universe is “hospitable to the growth of mind”. This is too often overlooked. And for Dyson, it offers reason enough for religious contemplation:

“I do not make any clear distinction between mind and God. God is what mind becomes when it has passed beyond the scale of our comprehension. God may be either a world-soul or a collection of world-souls. So I am thinking that atoms and humans and God may have minds that differ in degree but not in kind.” 18

I share with Dyson the opinion that it is better to relish these mysteries rather than to retreat to the dry deception of material certainty. For, as Shakespeare summed up so marvellously in his final play The Tempest: “we are such stuff as dreams are made on…”19 And perhaps this is still the best description we have of ourselves, even though we have no idea whatsoever how, as dream-machines, our dreams are woven.

A toast then! Feel free to join me in raising your glass… to your own mind, your psyche, your soul, call it what you will – a rose by any other name and all that. Three cheers! And to consciousness! To sentience! To uncanny awareness! That same stuff all our dreams are made on…

So with great appreciation and warm affection, here’s to that strangest of things: that thing I so very casually call my-self! But even more than this. To the actual stuff of our lives, to the brain, the entire central nervous system and far beyond. To the eyes and ears and fingertips; to the whole apparatus of our conscious awareness; and to the sentience of all our fellows, whether taking human or other forms! To the strangeness of the material world itself, from which all sentience has miraculously sparked! To the vast and incomprehensible Universe no less, whether manifestly inward or outward, for the distinction may be a finer one than we are in the habit of presuming! Here’s to wondering what we are… Drink up!

Next chapter…

*

John Searle is a philosopher who has closely studied the nature of consciousness and concludes that mind, though unique amongst biological phenomena and deeply mysterious, is nevertheless a natural function of brain activity. In this lecture he summarises the many failures of the current “scientific” approach to questions of consciousness:

In the interview below Searle discusses why he rejects both the hard-line materialist dismissal of consciousness as an illusion (which is actually nonsensical) and dualist alternatives that rely upon a false division between mind and matter:

And finally, Searle outlines the main difficulties surrounding the unresolved philosophical paradox of free will. Put succinctly, he says that although it is impossible to prove human beings have free will, and although any capacity for free will seems to defy physical causality, we are nevertheless compelled to experience conscious rational decision-making on a daily basis:

*

Addendum: the return of Frankenstein!

The issues surrounding the use of genetically modified organisms (GMOs) are many and complex, but it is perfectly clear that new developments in genetics, like those in nuclear physics more than half a century ago, have automatically opened the door to some quite extraordinary possibilities. Possibilities that will most assuredly impact our future no less dramatically than the advent of atomic reactors and the hydrogen bomb impacted our very recent past – and still continue to affect us today.

The need for a proper debate is long overdue but, hardly surprisingly, the huge bio-tech corporations prefer to keep the debate closed down. Monsanto, for instance, which claims that it is perfectly safe to release its GMOs directly into our environment, was also in the habit of claiming that its herbicide Roundup is so harmless you can drink it! 20 But then why on earth would anyone (or at least anyone not in their pocket) trust such self-interested and deliberately compromised risk assessments? The short answer is that the precautionary principle has once again been overridden by money and influence.

What we really need, of course, is a proper debate about the use of genetic modification. A debate that is open and public: a forum for discussion amongst leading experts (and especially those not associated with the powerful bio-tech firms); scientists from other fields, who though ignorant of specifics, might bring a detached expertise by virtue of familiarity with scientific procedures; alongside representatives from other interested parties such as ‘consumers’ (that’s the rest of us, by the way – we all consume, and though I hate the word too, it at least offers a slightly better perspective on our role within the current system, since this is how the system itself defines us).

This great debate needs to be fully inclusive, welcoming intelligent opinion, whether concordant or dissenting. No reasoned objections from any quarters being summarily dismissed as unscientific or anti-scientific, as is so often the case, because we must never leave it for technicians alone to decide on issues that so directly affect our common future. Relying on highly specialised experts alone – even when those experts are fully independent (as they so rarely are these days) –  would be as unwise as it is anti-democratic.

Genetic manipulation is already upon us. It is already helping in the prevention and treatment of diseases, and in the production of medicines such as insulin (although even here serious questions are arising with regards to the potentially harmful side-effects of using a genetically modified product). More controversial again is the development of pest- and drought-resistant strains of crops; developments that are claimed by their producers to have alleviated a great deal of human suffering already, but which seem to have brought misery of new kinds – I will come back to this later.

And then we come to the development of Genetic Use Restriction Technology (GURT), better known as ‘suicide’ or ‘Terminator’ seeds, which are promoted by the industry as a ‘biosafety’ solution. Engineered sterility being a clever way of preventing their own genetically modified plants from causing unwanted genetic contamination – which we might think of as a new form of pollution. The argument being that if modified genes (whether pharmaceutical, herbicide resistance or ‘Terminator’ genes) from a ‘Terminator’ crop get transferred to related plants via cross-pollination, the seed produced from such pollination will be sterile. End of problem.

But this is merely an excuse, of course, and if used in this way, the new technology will ultimately prevent over a billion of the poorest people in the world from continuing in their age-old practice of saving seeds for resowing, which will, as a consequence, make these same farmers totally dependent on a few multinational bio-tech companies. All of which serves as an excellent means for monopolising the world’s food supplies, and offers a satisfactory solution only for the owners of companies like Monsanto. 21

In any case, do we really wish to allow patents on specific genes, opening the door to the corporate ownership of the building blocks to life itself? The world renowned physicist and futurist visionary Freeman Dyson draws a direct comparison to earlier forms of slavery:

“The institution of slavery was based on the legal right of slave-owners to buy and sell their property in a free market. Only in the nineteenth century did the abolitionist movement, with Quakers and other religious believers in the lead, succeed in establishing the principle that the free market does not extend to human bodies. The human body is God’s temple and not a commercial commodity. And now in the twenty-first century, for the sake of equity and human brotherhood, we must maintain the principle that the free market does not extend to human genes.” 22

Nor, I would quickly add, should it extend to the ownership of genes of other higher species of animal or plant life. Moreover, I personally have no wish whatsoever for apples, tomatoes, potatoes (or even tobacco) that provide the RDA for all my nutritional needs, or for any other supposed improvement on the original designs – preferring to trust apples, tomatoes and potatoes that evolved alongside my own human digestive system. And this ought not to be treated as merely a preference, but established as a human right, since we all have the right not to eat GMOs just as we have the right to be vegan (not that I’m a vegan, by the way).

Beyond this, we also need to consider the many perfectly serious and inescapable ethical issues that arise once you are tinkering with the primary source code of life itself. Take cloning as an interesting example.

Identical twins are essentially clones, having both developed from the same fertilised egg, and thus sharing the same DNA. But then nature sometimes goes one step further again:

A form of virgin birth has been found in wild vertebrates for the first time.

Researchers in the US caught pregnant females from two snake species and genetically analysed the litters.

That proved the North American pit vipers reproduced without a male, a phenomenon called facultative parthenogenesis that has previously been found only in captive species. 23

I have since learned that parthenogenesis (reproduction without fertilisation or “virgin birth”) is surprisingly common throughout the plant and animal kingdoms. Birds do it, bees do it… and even mammals have been induced to do it. So cloning is not inherently unnatural, and if carried out successfully (as it frequently is in nature), it may one day be no more harmful, nor more fraught with latent dangers, to be a cloned individual than to be an individual produced by other forms of artificial reproduction. Furthermore, since we already know what human twins are like, we already know what human clones will be like. Yet many ethical questions still hang.

For instance, should anyone be allowed to clone themselves? Or more generally, who chooses which of us are to be cloned? Do we just leave it to the market to decide? And why would we ever want a world populated by identical (or rather, approximately identical – since no two twins are truly identical and there are sound biological reasons for believing clones will never be perfectly reproduced either) human beings? Such ethical questions are forced by the new biotechnologies. And there are many further reasons for why ordinary, intelligent public opinion needs to be included in the debate.

Here is Freeman Dyson again, summarising his own cautious optimism as we enter the age of the new ‘green technologies’:

“I see two tremendous goods coming from biotechnology in the next century, first the alleviation of human misery through progress in medicine, and second the transformation of the global economy through green technology spreading wealth more equitably around the world. The two great evils to be avoided are the use of biological weapons and the corruption of human nature by buying and selling genes. I see no scientific reason why we should not achieve the good and avoid the evil.

The obstacles to achieving the good are political rather than technical. Unfortunately a large number of people in many countries are strongly opposed to green technology, for reasons having little to do with the real dangers. It is important to treat the opponents with respect, to pay attention to their fears, to go gently into the new world of green technology so that neither human dignity nor religious conviction is violated. If we can go gently, we have a good chance of achieving within a hundred years the goals of ecological sustainability and social justice that green technology brings within our reach.” 24

Dyson was no doubt being too optimistic: many of the dangers of GMOs have slowly come to light in the more than two decades since he spoke these words as part of his acceptance speech for the Templeton Prize in 2000.

Meanwhile in 2012, Greenpeace issued the following press release. It contains the summary of an open letter sent by nearly a hundred Indian scientists to the Supreme Court of India:

An official report submitted by the technical Expert committee set up by the Supreme Court of India comprising of India’s leading experts in molecular biology, toxicology and biodiversity – unanimously recommends a 10-year moratorium on all field trials of GM Bt [insecticide producing due to genes from Bacillus thuringiensis] food crops, due to serious safety concerns. The committee has also recommended a moratorium on field trials of herbicide tolerant crops until independent assessment of impact and suitability, and a ban on field trials of GM crops for which India is center of origin and diversity.

The report’s recommendations are expected put a stop to all field releases of GM food crops in India, including the controversial Bt eggplant, whose commercial release was put under an indefinite moratorium there last February 2010. Contrarily, the same Bt eggplant is currently being evaluated for approval in the Philippines.

“This official unanimous declaration on the risks of GMOs, by India’s leading biotech scientists is the latest nail on the coffin for GMOs around the world,” said Daniel M. Ocampo, Sustainable Agriculture Campaigner of Greenpeace Southeast Asia. “It is yet another proof that GMOs are bad for the health, bad for the environment, bad for farmers and bad for the economy.” 25

For though it would be foolish to fail to recognise the enormous potential benefits of some of the new ‘green technologies’, any underestimate of the hazards is sheer recklessness. And this is where my own opinion differs significantly from that of enthusiasts like Dyson. This science is so brilliantly new, and so staggeringly complex, that the dangers are real and very difficult to over-estimate; public concern is therefore fully justified, whether over health and safety issues, over the politico-economic repercussions, or due to anxieties of a more purely ethical kind.

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.

1 There is sound evidence for believing that protons and neutrons are made of quarks, whereas electrons it seems are a type of fundamental particle which has no component parts.

2 My use of the analogue/digital comparison is simplistic, of course, but then it is only intended as a loose analogy, nothing more.

3 Since writing this I have come upon a range of so-called Young Earth Theories of Geology that contradict my former opinion. Apparently there are indeed groups of Creationists intent on disproving ideas of a 4.5 billion year old planet in favour of a ten thousand year prehistory. Needless to say there is no supporting evidence for this contention.

4

“Cro-magnons are, in informal usage, a group among the late Ice Age peoples of Europe. The Cro-Magnons are identified with Homo sapiens sapiens of modern form, in the time range ca. 35,000-10,000 b.p. […] The term “Cro-Magnon” has no formal taxonomic status, since it refers neither to a species or subspecies nor to an archaeological phase or culture. The name is not commonly encountered in modern professional literature in English, since authors prefer to talk more generally of anatomically modern humans (AMH). They thus avoid a certain ambiguity in the label “Cro-Magnon”, which is sometimes used to refer to all early moderns in Europe (as opposed to the preceding Neanderthals), and sometimes to refer to a specific human group that can be distinguished from other Upper Paleolithic humans in the region. Nevertheless, the term “Cro-Magnon” is still very commonly used in popular texts because it makes an obvious distinction with the Neanderthals, and also refers directly to people rather than to the complicated succession of archaeological phases that make up the Upper Paleolithic. This evident practical value has prevented archaeologists and human paleontologists – especially in continental Europe – from dispensing entirely with the idea of Cro-Magnons.”

Taken from The Oxford Companion to Archaeology. Oxford, UK: Oxford University Press. p. 864.

5

“Jambo, Jersey Zoos world famous and much loved silverback gorilla had a truly remarkable life. He was born in Basel Zoo in Switzerland in 1961. He arrived at Jersey Zoo on the 27th April 1972. Jambo, Swahili for Hello, is perhaps better known to the public for the gentleness he displayed towards the little boy who fell into the gorilla enclosure at Jersey Zoo one afternoon in 1986. The dramatic event hit the headlines and helped dispel the myth of gorillas as fearsome and ferocious. It was a busy Sunday afternoon in August 1986 when an incredulous public witnessed Levan Merritt a small boy from Luton UK fall into the Gorilla enclosure at Jersey Zoo. “

Extract taken from “The Hero Jambo”, a tribute to Jambo written by the founder of Jersey Zoo, Gerald Durrell.

6

“LAST SUMMER, AN APE SAVED a three-year-old boy. The child, who had fallen 20 feet into the primate exhibit at Chicago’s Brookfield Zoo, was scooped up and carried to safety by Binti Jua, an eight-year-old western lowland female gorilla. The gorilla sat down on a log in a stream, cradling the boy in her lap and patting his back, and then carried him to one of the exhibit doorways before laying him down and continuing on her way.”

Extract taken from F. B. M. de Waal (1997), “Are we in anthropodenial?”, Discover 18 (7): 50–53.

7   

“Binti became a celebrity overnight, figuring in the speeches of leading politicians who held her up as an example of much-needed compassion. Some scientists were less lyrical, however. They cautioned that Binti’s motives might have been less noble than they appeared, pointing out that this gorilla had been raised by people and had been taught parental skills with a stuffed animal. The whole affair might have been one of a confused maternal instinct, they claimed.”

Ibid.

8 Quoted in an article entitled: “Confessions of a Lonely Atheist: At a time when religion pervades every aspect of public life, there’s something to be said for a revival of pagan peevishness”, written by Natalie Angier for The New York Times Magazine, from January 14, 2001.

9 In 1907, MacDougall weighed six patients who were in the process of dying (accounts of MacDougall’s experiments were published in the New York Times and the medical journal American Medicine). He used the results of his experiment to support the hypothesis that the soul had mass (21 grams to be precise), and that as the soul departed the body, so did its mass. He also measured fifteen dogs under similar conditions and reported the results as “uniformly negative”. He thus concluded that dogs did not have souls. MacDougall’s complaints about not being able to find dogs dying of natural causes have led at least one author to conjecture that he was in fact poisoning dogs to conduct these experiments.

10 Extract taken from Chapter 2, “Thinking Machines” of Steven Pinker’s How the Mind Works, published by Penguin Science, 1997, p 148. Italics in the original.

11 An operant conditioning chamber (sometimes known as a Skinner box) is a laboratory apparatus developed by BF Skinner, founding father of “Radical Behaviourism”, during his time as a graduate student at Harvard University. It is used to study animal behaviour and investigate the effects of psychological conditioning using programmes of punishment and reward.

12 Extract taken from Notes on the Way by George Orwell, first published in Time and Tide, London, 1940.

13  I received a very long and frank objection to this paragraph from one of my friends when they read through a draft version, which I think is worth including here by way of balance:

“I must explain that I’m a hedonist to a ridiculous degree, so much so that my “eudaemonism” (sounds dreadful –not like happiness-seeking at all!) is almost completely bound up with the pursuit of pleasure, as for me there is little difference between a life full of pleasures and a happy life.  Mind you, pleasure in my definition (as in most people’s, I guess) covers a wide array of things: from the gluttonous through to the sensuous, the aesthetic, the intellectual and even the spiritual; and I would also say that true pleasure is not a greedy piling up of things that please, but a judicious and even artistic selection of the very best, the most refined and the least likely to cause pain as a side effect  (I think this approach to pleasure is called “Epicureanism”).

Love, of course, is the biggest source of pleasure for most, and quite remarkably, it’s not only the receiving but the giving of it that makes one truly happy, even when some pain or sacrifice is involved.  This is how I explain acts of generosity like the one you describe, by the woman who helped you when you fell off your bike as a teenager: I think she must have done it because, despite the bother and the hassle of the moment, deep down it made her happy to help a fellow human being. We have all felt this way at some point or other, and as a result I believe that pleasure is not antithetical to morality, because in fact we can enjoy being kind and it makes us unhappy to see suffering around us. This doesn’t mean that we always act accordingly, and we certainly have the opposite tendency, too: there is a streak of cruelty in every human that means under some circumstances, we’ll enjoy hurting even those we love. But my point is, hedonism and a concern for others are not incompatible. The evolutionary reason for this must be that we are a social animal, so empathy is conducive to our survival as much as aggression and competitiveness may be in some environments. In our present environment, i.e. a crowded planet where survival doesn’t depend on killing lions but on getting on with each other, empathy should be promoted as the more useful of the two impulses. This isn’t going to happen, of course, but in my opinion empathy is the one more likely to make us happy in the long run.

Having attempted to clean up the name of pleasure a bit, I’ll try to address your other complaints against a life based on such principles: “Yet pleasure is more often short-lived, whilst happiness too is hard to maintain.” I agree, and this is indeed the Achilles heel of my position: I’m the most hypochondriac and anxiety-prone person I know, because as a pleasure-a-holic and happiness junkie I dread losing the things I enjoy most. The idea of ever losing [my partner], for example, is enough to give me nightmares, and I’m constantly terrified of illness as it might stop me having my fun. Death is the biggest bogie. I’m not blessed with a belief in the afterlife, or even in the cosmic harmony of all things. This is [my partner]’s belief as far as I can tell, and I’d like to share it, but I’ve always been an irrational atheist – I haven’t arrived at atheism after careful thinking, but quite the opposite, I’ve always been an atheist because I can’t feel the godliness of things, so it is more of a gut reaction with me. The closest thing to the divine for me is in beauty, the beauty of nature and art, but whether Beauty is Truth, I really don’t know, and in any case beauty, however cosmic, won’t make me immortal in any personal or individual sense. I’m horrified at the idea of ceasing to exist, and almost as much at the almost certain prospect of suffering while in the process of dying. This extreme fear is probably the consequence of my hedonist-epicurean-eudaemonism.

On the other hand, since everyone, including the most religious and ascetic people, is to some extent afraid of dying, is it really such a big disadvantage to base one’s life on the pursuit of pleasure and happiness? I guess not, although I must admit that I’d quite like to have faith in the Beyond. I suppose that I do have some of the agnostic’s openness to the mystery of the universe – as there are so many things that we don’t understand, and perhaps we aren’t even equipped to ever understand, it’s very possible that life and death have a meaning that escapes us. This is not enough to get rid of my fears, but it is a consolation at times.

Finally, I also disagree with you when you say that pleasure and happiness “are not, as we are accustomed to imagine, objects to be sought after at all. If we chase either one then it is perfectly likely that it will recede ever further from our reach.” There’s truth in this, but I think it’s also true that unless one turns these things into a priority, it is very difficult to ever achieve them. I for one find that more and more, many circumstances in my life conspire to stop me having any fun: there are painful duties to perform, ailments to cope with, bad news on a daily basis and many other kinds of difficulties, so if I didn’t insist on being happy at least a little every day, I’d soon forget how to do it. I’m rather militant about it, in fact. I’m always treating myself in some way, though to be fair to myself, a coffee and a croissant can be enough to reconcile me to a bad day at work, for example, so I’m not really very demanding. But a treat of some sort there has to be to keep me going. Otherwise, I don’t see the point.”

14  Kurt Vonnegut had originally trained to be a scientist, but said he wasn’t good enough. His older brother Bernard trained as a chemist and is credited with the discovery that silver iodide could be used to force precipitation through “cloud seeding”. If you ask for Vonnegut in a library, you’ll probably be directed toward the Science Fiction section, since many of his books are set in strangely twisted future worlds. However, his most famous and most widely acclaimed work draws on his experiences during the Second World War, and in particular on the Allied fire-bombing of Dresden. Vonnegut personally survived the attack by virtue of being held as a prisoner of war in an underground meat locker, and the irony of this forms the title of the novel, Slaughterhouse-Five.

15  Extract taken from “Why My Dog Is Not a Humanist” by Kurt Vonnegut, published in Humanist, Nov 92, Vol. 52:6.5-6.

16 “We cannot doubt existence without existing while we doubt…” So begins Descartes’ seventh proposition from the 76 “Principles of Human Knowledge” which form Part 1 of Principia philosophiae (Principles of Philosophy), published in Latin in 1644 and reprinted in French in 1647 – ten years after his groundbreaking treatise Discourse on the Method, in which “Je pense, donc je suis” (“I think, therefore I am”) had first appeared.

http://www.gutenberg.org/cache/epub/4391/pg4391.html

17 A more poetic version of Descartes’ proof had already been constructed centuries earlier by the early Islamic scholar Avicenna, who proposed a rather beautiful thought experiment in which we imagine ourselves falling, or else suspended, and thus isolated and devoid of all sensory input, including any sense of our own body. The “floating man”, Avicenna says, in spite of the complete absence of any perceptions of a world beyond, would nevertheless possess self-awareness. That he can still say “I am” proves that he is self-aware and that the soul exists. In consequence, Avicenna also places the soul above the material, although no priority is granted to reason above our other forms of cognition.

18  Further extracts from Freeman Dyson’s acceptance speech for the award of the Templeton Prize, delivered on May 16, 2000 at the Washington National Cathedral.

19  Prospero in Shakespeare’s The Tempest, Act IV, Scene 1.

20 In 1996, the New York Times reported that: “Dennis C. Vacco, the Attorney General of New York, ordered the company to pull ads that said Roundup was ‘safer than table salt’ and ‘practically nontoxic’ to mammals, birds and fish. The company withdrew the spots, but also said that the phrase in question was permissible under E.P.A. guidelines.”

Extract taken from Wikipedia with original reference retained. http://en.wikipedia.org/wiki/Monsanto#False_advertising

21 For further arguments against “Terminator Technology”, I recommend the following website: http://www.banterminator.org/content/view/full/233

22 From Freeman Dyson’s acceptance speech for the award of the Templeton Prize, delivered on May 16, 2000 at the Washington National Cathedral.

23  From an article entitled “Virgin births discovered in wild snakes” written by Jeremy Coles, published by BBC nature on September 12, 2012. http://www.bbc.co.uk/nature/19555550

24  Also from Freeman Dyson’s acceptance speech for the award of the Templeton Prize.

25 http://www.greenpeace.org/seasia/ph/press/releases/GMOs-declared-unsafe-in-India-Greenpeace-calls-on-PH-to-follow-suit/

This original link has since been removed but the same article can be read here:

https://web.archive.org/web/20130607155209/http://www.greenpeace.org/seasia/ph/press/releases/GMOs-declared-unsafe-in-India-Greenpeace-calls-on-PH-to-follow-suit/


apes of wrath?

The following article is Chapter Three of a book entitled Finishing The Rat Race.


*

What a piece of work is a man!

— William Shakespeare 1

*

Almost a decade ago, as explosions lit up the night sky above Baghdad, I was at my parents’ home in Shropshire, sat on the sofa, and watching the rolling news coverage. After a few hours we were still watching the same news though for some reason the sound was now off and the music system on.

“It’s a funny thing,” I remarked, between sips of whisky, and not certain at all where my words were leading, “that humans can do this… and yet also… this.” I suppose that I was trying to firm up a feeling. A feeling that arose in response to the unsettling juxtaposition of images and music, and that involved my parents and myself in different ways, as detached spectators. But my father didn’t understand at first, and so I tried again.

“I mean how can it be,” I hesitated, “that on the one hand we are capable of making such beautiful things like music, and yet on the other, we are the engineers of such appalling acts of destruction?” Doubtless I could have gone on elaborating, but there was no need. My father understood my meaning, and the evidence of what I was trying to convey was starkly before us – human constructions of the sublime and the atrocious side-by-side.

In any case the question, being of unavoidable and immediate importance to all of us, hangs in the air perpetually, although it is usually considered and recast in alternative ways – something I shall return to – while mostly it remains not merely unanswered, but unspoken. We treat it instead like an embarrassing family secret, best forgotten. Framed hesitantly but well enough for my father to reply, his answer was predictable too: “that’s human nature” – the quick and easy answer, and one that misses the point entirely; a common fallacy technically known as ignoratio elenchi. For ‘human nature’ in no way provides an answer but simply opens a new question. Just what is human nature? – This is the question.

The generous humanity of music and the indiscriminate but cleverly conceived cruelty of carpet bombing are just different manifestations of what human beings are capable of, and thus of human nature. If you point to both and say “this is human nature”, well yes – and obviously there’s a great deal else besides – whereas if you reserve the term only for occasions when you feel disapproval, revulsion or outright horror – as many do – then your condemnation is simply another feature of “human nature”. In fact, why do we judge ourselves at all?

So this chapter represents an extremely modest attempt to grapple with what is arguably the most complex and involved question of all questions. Easy answers are good when they cut to the bone of a difficult problem; however, man’s inhumanity to man, as well as to his other fellow creatures, surely deserves a better and fuller account than that man is by nature inhumane – if for no other reason than that the very word ‘human’ owes its origins to the earlier form ‘humane’! Upon this etymological root is there really nothing but vainglorious self-deception and wishful thinking? I trust that language is in truth less consciously contrived.

The real question then is surely this: When man becomes inhumane, why on this occasion or in this situation, but not on all occasions and under all circumstances? And how come we still use the term ‘inhumane’ at all, if being inhumane is so hard-wired into our human nature? The lessons to be learned by tackling such questions can hardly be overstated; lessons that might well prove crucial in securing the future survival of our societies, our species, and perhaps of the whole planet.

*

I        Monkey business

“There are one hundred and ninety-three living species of monkeys and apes. One hundred and ninety-two of them are covered with hair.”

— Desmond Morris 2

*

The scene: just before sunrise about one million years BC, a troop of hominids are waking up and about to discover a strange, rectangular, black monolith that has materialised from nowhere. As the initial excitement and fear of this strange new object wears off, the hominids move closer to investigate. Attracted perhaps by its remarkable geometry, its precise and unnatural blackness, they reach out tentatively to touch it and then begin to stroke it.

As a direct, though unexplained consequence of this communion, one of the ape-men has a dawning realisation. Sat amongst the skeletal remains of a dead animal, he picks up one of the sun-bleached thigh bones and begins to swing it about. Aimless at first, his flailing attempts simply scatter the other bones of the skeleton. In time, however, he gains control and his blows increase in ferocity, until at last, with one almighty thwack, he manages to shatter the skull to pieces. It is a literally epoch-making moment of discovery.

The following day, mingling beside a water-hole, a fight breaks out. His new weapon in hand, our hero deals a fatal blow against the alpha male of a rival troop. Previously at the mercy of predators and reliant on scavenging to find their food, the tribe can now be freed from fear and hunger too. Triumphant, he is the ape-man Prometheus, and in ecstatic celebration of this achievement, he tosses the bone high into the air, whereupon, spinning up and up, higher and higher into the sky, the scene cuts from spinning bone into an orbiting space-craft…

*

Stanley Kubrick’s 2001: A Space Odyssey is enigmatic and elusive. Told in a sequence of related if highly differentiated parts, it repeatedly confounds the viewer’s expectations – the scene sketched above is only the opening act to Kubrick’s seminal science-fiction epic.

Kubrick said: “you are free to speculate as you wish about the philosophical and allegorical meaning of the film.” 3 So taking Kubrick at his word, I shall do just that – although not for every aspect of the film, but specifically for his first scene, up to and including that most revered and celebrated ‘match cut’ in cinema history, and for its relationship to Kubrick’s mesmerising and seemingly bewildering climax: moments of transformation, when reality per se is re-imagined. On one level, at least, all of the ideas conveyed in this opening, as well as the more mysterious closing scenes (more below), are abundantly clear. For Kubrick’s exoteric message involves the familiar Darwinian interplay between the foxes and the rabbits and their perpetual battle for survival, which is the fundamental driving force behind the evolutionary development of natural species.

Not that Darwin’s conception should be misunderstood as war in the everyday sense, however popular that interpretation; for one thing, the adversaries in these Darwinian arms races, most often predator and prey, in general remain wholly unaware of any escalation in armaments and armour. Snakes, for example, have never sought to strengthen their venom, any more than their potential victims – most spectacularly the opossums that evolved to prey on them – made any conscious attempts to hone their blood-clotting agents. Today’s snake-eating opossums have extraordinary immunity to the venom of their prey purely because natural selection strongly favoured opossums with heightened immunity.

Of course, the case is quite different when we come to humankind. For it is humans alone who deliberately escalate their methods of attack and response and do so by means of technology. To talk of an “arms race” between species is therefore a somewhat clumsy metaphor for what actually occurs in nature – although Darwin is accurately reporting what he finds.

And there is another crucial difference between the Darwinian ‘arms race’ and the human variant. Competition between species is not always as direct as between predator and prey, and frequently looks nothing like a war at all. Indeed, it is more often analogous to the competitiveness of two hungry adventurers lost in a forest. It may well be that both of our adventurers are completely unaware that somewhere in the midst of the forest there is a hamburger left on a picnic table. And while neither may be aware of the presence of the other, they are nonetheless – at least in a strict Darwinian sense – in competition, since if either one stumbles accidentally upon the hamburger then, merely by process of elimination, the other has lost his chance of a meal. As competitors, then, the faster walker, or the one with keener eyes, or the one with greater stamina, will gain a slight but significant advantage over the other. Thus, perpetual competition between individuals need never amount to war, or even to battles, and this is how Darwin’s ideas are properly understood.

In any case, such contests of adaptation, whether between predators and prey, or sapling trees racing towards the sunlight, can never actually be won. The rabbits may get quicker but the foxes must get quicker too, since if either species fails to adapt then it will not survive long. So it’s actually a perpetual if dynamic stalemate, with species trapped like the Red Queen in Through the Looking-Glass, always having to keep moving just to hold their ground – a paradox that evolutionary biologists indeed refer to as the “Red Queen hypothesis” 4.

We might still judge that both sides are advancing, since there is, undeniably, a kind of evolutionary progress, with the foxes growing craftier as the rabbits get smarter too, and so we might conclude that such an evolutionary ‘arms race’ is the royal road to all natural progress – although, as Darwin noted, other evolutionary pressures, most notably sexual selection, have tremendous influence as well. We might even go further by extending the principle in order to admit our own steady technological empowerment, viewed objectively as a by-product of our own rather more deliberate arms race. Progress thus assured by the constant and seemingly inexorable fight for survival against hunger and the elements, and no less significantly, by the constant squabbling of our warring tribes over land and resources.

Space Odyssey draws deep from the science of Darwinism, and spins a tale of our future. From bony proto-tool, slowly but inexorably, we come to the mastery of space travel. From terrestrial infants, to cosmically-free adults – this is the overarching story of 2001. But wait, there’s more to that first scene than immediately meets the eye. That space-craft which Kubrick cuts to; it isn’t just any old space-craft…

Look quite closely and you might see that it’s actually one of four space-craft, similar in design, which form the components of an orbiting nuclear missile base, and though in the film this is not as clear as in Arthur C. Clarke’s parallel version of the story (the novel and film were co-creations written side-by-side), the missiles are there if you peer hard enough.

So Space Odyssey is, at least on one level, the depiction of technological development, which, though superficially from first tool to more magnificent uber-tool (i.e., the spacecraft), is also – and explicitly in the novel – a development from the first weapon to what is, up to now, the ultimate weapon, and thus from the first hominid-cide to the potential annihilation of the entire human population. 5

Yet 2001, the year in the title, also magically heralds a new dawn for mankind: a dawn that, as with every other dawn, bursts from the darkest hours. The meaning therefore, as far as I judge it, is that we, as parts of nature, are born to be both creators and destroyers; agents of light and darkness. That our innate but unassailable evolutionary drive, dark as it can be, also has the potential to lead us to the film’s weirdly antiseptic yet quasi-mystical conclusion, and the inevitability of our grandest awakening – a cosmic renaissance as we follow our destiny towards the stars.

Asked in an interview whether he agreed with some critics who had described 2001 as a profoundly religious film, Kubrick replied:

“I will say that the God concept is at the heart of 2001—but not any traditional, anthropomorphic image of God. I don’t believe in any of Earth’s monotheistic religions, but I do believe that one can construct an intriguing scientific definition of God, once you accept the fact that there are approximately 100 billion stars in our galaxy alone, that each star is a life-giving sun and that there are approximately 100 billion galaxies in just the visible universe.”

Continuing:

“When you think of the giant technological strides that man has made in a few millennia—less than a microsecond in the cosmology of the universe—can you imagine the evolutionary development that much older life forms have taken? They may have progressed from biological species, which are fragile shells for the mind at best, into immortal machine entities—and then, over innumerable eons, they could emerge from the chrysalis of matter transformed into beings of pure energy and spirit. Their potentialities would be limitless and their intelligence ungraspable by humans.”

When the interviewer pressed further, inquiring what this envisioned cosmic evolutionary path has to do with the nature of God, Kubrick added:

“Everything—because these beings would be gods to the billions of less advanced races in the universe, just as man would appear a god to an ant that somehow comprehended man’s existence. They would possess the twin attributes of all deities—omniscience and omnipotence… They would be incomprehensible to us except as gods; and if the tendrils of their consciousness ever brushed men’s minds, it is only the hand of God we could grasp as an explanation.” 6

Kubrick was an atheist, although unlike many atheists he acknowledged that the religious impulse is an instinctual drive no less irrepressible than our hungers to eat and to procreate. This is so because at the irreducible heart of religion lies pure transcendence: the climbing up and beyond ordinary states of being. This desire to transcend – whether by shamanic communion with the ancestors and animistic spirits, monastic practices of meditation and devotion, or by brute technological means – is something common to all cultures.

Thus the overarching message of 2001 is firstly that human nature is nature, for good and ill, and secondly that our innate capacity for reason will inexorably propel us to transcendence of our terrestrial origins. In short, it is the theory of Darwinian evolution writ large: Darwinism appropriated and repackaged as an updated creation story – a new mythology and surrogate religion that lends life an alternative meaning. We will cease to worship nature or humanity, which is nature, it says, and if we continue to worship anything at all, our new icons will be representative only of Progress (capital P). Thus, evolution usurps god! Of course, the symbolism of 2001 can be given esoteric meaning too – indeed, there can never be a final exhaustive analysis of 2001 because, like all masterpieces, its full meaning is open to an infinitude of interpretations – and this I leave entirely for others to speculate upon.

In 1997, Arthur C. Clarke was invited by the BBC to appear on a special edition of the documentary series ‘Seven Wonders of the World’ (Season 2).

*

I have returned to Darwin only because his vision of reality has become the accepted one. Acknowledging that human nature is just another natural outgrowth, it is tempting therefore to look to Darwin for answers. However, as I touched upon in the previous chapter, though Darwinism as biological mechanism is extremely well-established science, interpretations that follow from those evolutionary principles differ, and this is especially the case when we try to understand patterns of animal behaviour: how much stress to place on our own biological origins remains an even more hotly debated subject. And if we are to adjudicate fairly then one important consideration must be where Darwin’s own ideas originated.

In fact, as with all great scientific discoveries, we can trace a number of precursors including the nascent theory of his grandfather Erasmus, a founder member of the Lunar Society, who wrote lyrically in his seminal work Zoonomia:

“Would it be too bold to imagine, that in the great length of time, since the earth began to exist, perhaps millions of ages before the commencement of the history of mankind, would it be too bold to imagine, that all warm-blooded animals have arisen from one living filament, which THE GREAT FIRST CAUSE endued with animality, with the power of acquiring new parts, attended with new propensities, directed by irritations, sensations, volitions, and associations; and thus possessing the faculty of continuing to improve by its own inherent activity, and of delivering down those improvements by generation to its posterity, world without end!” 7

So doubtless Erasmus sowed the seeds for the Darwinian revolution, although his influence alone does not account for Charles Darwin’s central tenet that it is “the struggle for existence” which provides, as indeed it does, one plausible and vitally important mechanism in the process of natural selection, and thus, a key component in his complete explanation for the existence of such an abundant diversity of species. But again, what caused Charles Darwin to suspect that “the struggle for existence” necessarily involved such “a war of all against all” to begin with?

Well, it turns out that he had borrowed the idea of “the struggle for existence”, a phrase that he uses as the title heading chapter three of The Origin of Species, directly from Thomas Malthus 8. Interestingly, Alfred Russel Wallace, the less remembered co-discoverer of evolutionary natural selection, who reached his own conclusions independently of Darwin, had also been inspired in part by this same concept, which though ancient in origin was by then generally attributed to Malthus.

The notion of “a war of all against all” however traces back further, at least as far as the English Civil War, and to the writings of the highly influential political philosopher, Thomas Hobbes. 9 So it is indirectly from the writings of these two redoubtable Thomases that much of our modern thinking about Nature, and therefore, by extension, about human nature, has itself evolved. It is instructive therefore to examine the original context in which Hobbes’s and Malthus’s own ideas formed and developed; contributions not only crucial to the evolution of evolutionary thinking, but foundational to the development of our post-enlightenment western civilisation. To avoid too much of a digression, I have decided to leave further discussion of Malthus and his continuing legacy for the addendum below, and here to focus attention on the thoughts and influence of Hobbes. But to get to Hobbes, who first devoted his attention to the study of the natural sciences and optics in particular, I will provide a brief diversion by way of my own subject, Physics.

*

The title of Thomas Pynchon’s most celebrated novel, Gravity’s Rainbow, published in 1973, darkly alludes to the ballistic flight path of Germany’s V2 rockets that fell over London during the last days of the Second World War. Pynchon was able to conjure up this provocative metaphor because by the late twentieth century everyone already knew very well, and seemingly from direct experience, how projectiles follow a symmetrical and parabolic arc. It is strange to think, therefore, that for well over a millennium people in the western world, including the most scholarly among them, had believed that motion followed a set of quite different laws, presuming the trajectory of a thrown object, rather than following any sweeping arc, must be understood instead as comprising two quite distinct phases.

Firstly, impelled by a force, the object was presumed to enter a stage of “unnatural motion” as it climbed away from the earth’s surface – its natural resting place – before, having eventually run out of steam, it abruptly falls back to earth under “natural motion”. This is indeed our most common sense view of motion – a view any child would instantly recognise and immediately comprehend – although, as with many common sense views of the physical world, it is absolutely wrong.
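For contrast, the post-Galilean account fixes the whole flight in a single expression. The following is a standard kinematics sketch (not from the original text), with air resistance neglected, for a projectile launched at speed v and angle θ:

```latex
% Position under constant gravity g, for launch speed v at angle \theta:
x(t) = v t \cos\theta, \qquad y(t) = v t \sin\theta - \tfrac{1}{2} g t^{2}
% Eliminating t yields the path itself -- a single symmetric parabola,
% with no change of regime between ascent and descent:
y = x \tan\theta - \frac{g\, x^{2}}{2 v^{2} \cos^{2}\theta}
```

One smooth arc, in other words, rather than Aristotle’s two distinct phases – though the formula only holds when drag is negligible, which is precisely what a balloon is not.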

This rather striking illustration of scientific progress was first brought to my attention by a university professor who worked it into an unforgettable demonstration at the beginning of a lecture on error analysis. On the blackboard he first sketched out the two competing hypotheses: a beautifully smooth arc captioned ‘Galileo’ and beside it a pair of arrows, up and then down, labelled ‘Aristotle’. Obviously Galileo was about to win, but then came the punchline, as he pulled out a balloon, slapped it upward at roughly forty-five degrees, and we all watched it drift back to earth just as Aristotle rightly predicted. With tremendous glee he finally drew a huge chalk cross through Galileo, and declared the moral (in case anyone had missed it): above and beyond all other considerations, it is essential that you first design your experiment and carry out your observations with due care! 10

*

Legend tells us that Newton was sitting under an apple tree in his garden, unable to fathom what force could maintain the earth in its orbit around the sun, when all of a sudden an apple fell and hit him on the head. And if this is a faithful account of Newton’s Eureka moment, then the symbolism is surely remarkable. We might even say that it was this fall of Newton’s apple that redeemed humanity after the original Fall; snapping Newton and by extension all humanity instantaneously from darkness into an Age of Reason. For if expulsion from Eden involved eating an apple, Newton’s apple paved the way for a new golden age. As poet Alexander Pope wrote so exuberantly: “Nature and Nature’s laws lay hid in night: God said, Let Newton be! and all was light.” 11

Of course Newton’s journey into light had not been a solo venture, and as he said himself, “if I have seen further, it is by standing on the shoulders of giants.” 12 The predecessors and contemporaries Newton pays homage to include Descartes, Huygens, and Kepler, although the name that stands tallest today is once again Galileo. For it was Galileo’s observations and insights that led more or less directly to what we describe today as Newton’s Laws, and in particular Newton’s First Law, which states (in various formulations) that objects remain in uniform motion or at rest unless acted upon by a force.

This deceptively simple law has many surprising consequences. It means that when we see an object moving faster and faster or slower and slower or – and this is an important point – changing its direction of motion, then there must be a force impelling it. Thus it follows that there is a requirement for a force to arc the path of the earth about the sun, and, likewise, one causing the moon to revolve about the earth; hence gravity. Conversely, if an object is at rest (or moving in a straight line at constant speed – the law makes no distinction) then we know the forces acting on it must be balanced in such a way as to cancel to zero. Thus, we can tell purely from any object’s motion whether the forces acting on it are ‘in equilibrium’ or not.

An alternative way of thinking about Newton’s First Law requires the introduction of a related idea called ‘inertia’. Inertia is the reluctance of any object to change its motion, and it turns out that the more massive the object, the greater its inertia – here I am paraphrasing Newton’s Second Law. What this means in practice is that if you set up a situation in which there are no resistive forces, then an object will travel on continually with unchanging velocity. This completely counterintuitive discovery was arguably Galileo’s finest achievement, and it is the same principle exploited by maglev trains, which float on magnetic fields to eliminate friction with the track (and which proposed ‘hyperloop’ systems would send through evacuated tubes to reduce air resistance too). It also permitted Galileo’s understanding of how the earth could revolve indefinitely around the sun without us ever noticing.

While others falsely presumed that the birds would get left behind if the earth were in motion, Galileo saw that the earth’s moving platform is no different in principle from a moving ship, and that, as on board a ship, nothing gets left behind as it travels forward. This is easier to envisage if you think about being on a train or in a car, and recall how it feels at constant speed – how you sometimes cannot even tell whether it is your own train or the one at the other platform that is moving.

Of course, when Galileo insisted on a heliocentric reality, he was directly challenging Papal authority, and he paid the inevitable price for his impertinence. Moreover, when he implored his opponents merely to look through his own telescope and see for themselves, they quickly declined the invitation. This is the nature of fundamentalism – not just its religious variants but all forms. It is also in our own nature – this confirmation bias – to have little or no desire to learn that we might be wrong about matters of central concern. So the Inquisition in Rome tried him instead, and naturally found him guilty, sentencing Galileo to lifelong house arrest with a strict ban on ever publishing anything again. Given the age, this was comparatively lenient; three decades earlier the Dominican friar and philosopher Giordano Bruno, who amongst other blasphemies had dared to suggest that the universe had no centre and that the stars were just other suns surrounded by planets of their own, was burned at the stake.

Today, our temptation is to regard the Vatican’s hostility towards Galileo’s new science as a straightforward attempt to deny a reality that devalues the Biblical story, which places not just the earth but the holy city of Jerusalem at the centre of the cosmos. However, Galileo’s heresy actually strikes a more fundamental blow, one that challenges not just papal infallibility and the centuries-long Scholastic tradition – the tripartite dialectical synergy of Aristotle, Neoplatonism and Christianity – but by extension, the entire hierarchical establishment of the late medieval period and much more.

Prior to Galileo, as my professor illustrated so expertly with his hilarious balloon demonstration, the view had endured that all objects obeyed laws according to their inherent nature. Thus, rocks fell to earth because they were by nature ‘earthly’, whereas the sun and moon remained high above because they were made of more heavenly stuff. In short, things knew their place. By contrast, Galileo’s explanation is startlingly egalitarian. According to this new opinion, not only do all objects follow common laws – laws that apply even to celestial bodies like the planets and moon – but they are forced to do so because they are inherently inert. Not impelled by inner drives – a living essence – but compelled always and absolutely by external forces. At a stroke the universe is hereby reduced to mechanics, its inner workings akin to those of a most elaborate mechanism. Indeed, it is reasonable to say that at a stroke Galileo killed the cosmos.

Now if Newton’s apple is a reworking of the Fall of Man as humanity’s redemption through scientific progress, then the best-known fable of Galileo (for the tale itself is again wholly apocryphal) is how he dropped cannon balls of differing sizes from the Leaning Tower of Pisa in order to test how objects fall to earth, observing that they landed together. The experiment was famously recreated by Apollo astronauts on the moon’s surface, where, without the hindrance of an atmosphere, it was found that even objects as shockingly different as a hammer and a feather do indeed accelerate at the same rate, landing in the dust at precisely the same instant. In fact, I have repeated the experiment myself, standing on a desk in class with smaller objects and surrounded by bemused students who, unfamiliar with the principle, are reliably astonished; for intuitively we all believe that heavier weights fall faster.
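The arithmetic behind the astonishment is simple. In the idealised drag-free case, the time to fall from rest through a height h is t = √(2h/g) – mass appears nowhere in the formula. The following sketch (my own illustration; the heights and masses are invented example values) makes the point by refusing to take a mass argument at all:

```python
# Illustrative sketch: in a vacuum, fall time depends on drop height alone,
# never on the falling object's mass. Values below are invented examples.
import math

g = 9.81  # m/s^2 on Earth (lunar gravity is ~1.62, but the conclusion holds)

def fall_time(height_m: float) -> float:
    """Time for ANY object to fall height_m metres from rest, ignoring drag.

    Derived from height = (1/2) * g * t^2, so t = sqrt(2 * height / g).
    Note there is no mass parameter: hammer and feather get the same answer.
    """
    return math.sqrt(2 * height_m / g)

for obj, mass_kg in [("hammer", 1.3), ("feather", 0.03)]:
    print(f"{obj} ({mass_kg} kg) dropped from 1.5 m: {fall_time(1.5):.3f} s")
```

In everyday air, of course, drag spoils the demonstration for a feather, which is exactly why the airless lunar version was so persuasive.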

But my real point is this: Galileo’s thought experiment invokes a different Biblical reference. It is also a parable of sorts, reminding us all not to jump to unscientific assumptions but instead always “to do the maths”. In common with Newton’s apple, it retells another myth from Genesis; in this case recalling the Tower of Babel, an architectural endeavour conceived when the people of the world were united and hoped to build a short-cut to heaven. Afterwards, God decided to punish us all (as He likes to do) with a divide and conquer strategy; our divided nations further confused by a multiplicity of languages. But then along came Galileo to unite us again with his own gift, the application of a new universal language called mathematics. As he wrote:

Philosophy is written in this grand book, which stands continually open before our eyes (I say the ‘Universe’), but it cannot be understood without first learning to comprehend the language and to know the characters in which it is written. It is written in mathematical language, and its characters are triangles, circles and other geometric figures, without which it is humanly impossible to understand a single word; without these, one is wandering in a dark labyrinth. 13

*

Thomas Hobbes was very well studied in the works of Galileo, and on his travels around Europe in the mid 1630s he may very well have visited the great man in Florence. 14 In any case, Hobbes fully adopts Galileo’s mechanistic conception of the universe and draws what he sees as its logical conclusion, extrapolating from what is true for external nature and determining that this must also be true of human nature.

All human actions, Hobbes says, whether voluntary or involuntary, are the direct outcomes of physical bodily processes occurring inside our organs and muscles. 15 Of the precise mechanisms, he ascribes the origins to “insensible” actions that he calls “endeavours”; something he leaves for physiologists to study and comprehend. 16

Fleshing out this bio-mechanical model, Hobbes next explains how all human motivations – which he calls ‘passions’, and which must likewise function on the basis of these material processes – are thereby reducible to forces of attraction and repulsion; in his own terms, “appetites” and “aversions”. 17 Just like elaborate machines, Hobbes says, humans too operate in accordance with responses that entail either the automatic avoidance of pain or the increase of pleasure; the will being merely the overarching passion governing all these lesser appetites.

Thus, having presented this strikingly modern conception of life as a whole and human nature in particular, which he has determined is inherently ‘selfish’ since concerned only with improving its own situation, Hobbes next considers what he calls “the natural condition of mankind” (or ‘state of nature’) which leads him to consider why “there is always war of everyone against everyone”:

Whatsoever therefore is consequent to a time of War, where every man is Enemy to every man; the same is consequent to the time, wherein men live without other security, than what their own strength, and their own invention shall furnish them withall. In such condition, there is no place for Industry; because the fruit thereof is uncertain; and consequently no Culture of the Earth; no Navigation, nor use of the commodities that may be imported by Sea; no commodious Building; no Instruments of moving, and removing such things as require much force; no Knowledge of the face of the Earth; no account of Time; no Arts; no Letters; no Society; and which is worst of all, continual fear, and danger of violent death; And the life of man, solitary, poor, nasty, brutish, and short. 18

According to Hobbes, this ‘state of nature’ becomes inevitable whenever our laws and social conventions cease to function and no longer protect us from our otherwise fundamentally rapacious selves. Once civilisation gives way to anarchy, then anarchy, according to Hobbes, is hell because our automatic drive to improve our own situation comes into direct conflict with every other human individual. And to validate his claim, Hobbes reminds us of the fastidious counter measures everyone takes to defend themselves against their fellows:

It may seem strange to some man, that has not well weighed these things; that Nature should thus dissociate, and render men apt to invade, and destroy one another: and he may therefore, not trusting to this Inference, made from the Passions, desire perhaps to have the same confirmed by Experience. Let him therefore consider with himself, when taking a journey, he arms himself, and seeks to go well accompanied; when going to sleep, he locks his doors; when even in his house he locks his chests; and this when he knows there be Laws, and public Officers, armed, to revenge all injuries shall be done him; what opinion he has of his fellow subjects, when he rides armed; of his fellow Citizens, when he locks his doors; and of his children, and servants, when he locks his chests. Does he not there as much accuse mankind by his actions, as I do by my words? 19

Not that Hobbes is making a moral judgment, since he regards all nature, drawing no distinction for human nature, as equally compelled by the self-same ‘passions’; and in this ongoing war of all against all he sees the world as objectively value-neutral. As he continues:

But neither of us accuse mans nature in it. The Desires, and other Passions of man, are in themselves no Sin. No more are the Actions, that proceed from those Passions, till they know a Law that forbids them; which till Laws be made they cannot know: nor can any Law be made, till they have agreed upon the Person that shall make it. 20

All’s fair in love and war because fairness isn’t the point. According to Hobbes, what matters are the consequences of actions, and this again is a strikingly modern stance. Finally, Hobbes wishes only to ameliorate the flaws he perceives in human nature, in particular selfishness, by constraining behaviour in accordance with what he deduces to be ‘laws of nature’: precepts and general rules found out by reason. This, says Hobbes, is the only way to overcome what is otherwise man’s sorry state of existence in which a perpetual war of all against all ensures everyone’s life is “nasty, brutish and short”. Thus to save us all from our ‘state of nature’, as he calls it, he demands that we conform to his more reasoned ‘laws of nature’.

In short, not only does Hobbes’ prognosis speak to the urgency of securing a social contract, but his whole thesis heralds our bio-mechanical conception of life and of the evolution of life. Indeed, following from the tremendous successes of the physical sciences, Hobbes’ radical faith in materialism, which would then have seemed shocking to many, has slowly come to seem quite commonsensical; so much so that it led philosopher Karl Popper to coin the phrase “promissory materialism”: adherents to the physicalist view dismissing all concerns about gaps in understanding as mere problems to be worked out in the future – just as Hobbes does, of course, when he delegates the task of comprehending all human actions and “endeavours” to the physiologists.

*

But is it really the case, as Hobbes declares, that individuals are controlled only by laws and social contracts? If so, then we might immediately wonder why acts of indiscriminate murder and rape are such comparatively rare crimes, given that they are the toughest of all crimes to foil or to solve. In fact most people, most of the time, appear to prefer not to commit everyday atrocities, and it would be odd to suppose that they refrain purely because they fear arrest and punishment. Everyday experience tells us instead that most people simply don’t have very much inclination for committing violence or other serious criminal acts.

Moreover, if we look for supporting evidence of Hobbes’ conjecture then we actually find an abundance that refutes him. We know for instance that the appalling loss of life in the trenches of the First World War would have been far greater still were it not for a very deliberate lack of aim amongst the combatants. And this lack of zeal for killing even in the heat of battle turns out to be the norm, as US General S. L. A. Marshall learned from firsthand accounts gathered at the end of World War II, when he was tasked with debriefing thousands of returning GIs in order to learn more about their combat experiences. 21 What he actually discovered was almost incredible: not only had three-quarters of combatants never fired at the enemy even when under direct fire themselves, but amongst those who did, only two percent actually shot to kill.

Nor is this a modern phenomenon. At the end of the Battle of Gettysburg during the American Civil War, the Union Army collected up tens of thousands of discarded weapons and discovered that the vast majority were still loaded. More than half of the rifles held multiple loads – one had 23 charges packed all the way up the barrel. 22 Many of these soldiers had never actually pulled the trigger; the majority quite literally preferring to feign combat rather than fire off shots.

Indeed it transpires that, contrary to the depictions of battle in Hollywood movies, by far the majority of men take no pleasure at all in killing one another. Modern military training from Vietnam onwards has developed methods to compensate for this natural lack of bloodlust: heads are shaven, identities stripped, and conscripts otherwise desensitised, turning men into better machines for war. But then, if there is one day in history more glorious than any other, surely it has to be the Christmas Armistice of 1914: the bloodied and muddied troops huddling for warmth in no-man’s land, sharing food, singing carols together, and playing the most beautiful game of football ever played – an outpouring of sanity in the face of lunacy that no screenwriter could invent without it appearing impossibly sentimental and clichéd.

*

In his autobiography Hobbes famously relates that his mother’s shock at hearing news of the Spanish Armada led to his premature birth, saying: “my mother gave birth to twins: myself and fear.” Doing his best to avoid getting caught up in the English Civil War, Hobbes certainly did live through exceptionally fearful times, which accounts for why his entire political theory is a response to fear with a tolerance for tyranny. For Hobbes understood clearly that the power to protect is derived from the power to terrify; indeed, to kill. In fact, Hobbes manages to conceive of a system of government whose authority is sanctified by terrifying its own subjects into consenting to their own subjugation. On the same basis, when a highwayman demands “your money or your life”, then if you agree you have entered into a Hobbesian contract! This is government by protection racket; his keenness for an overarching, unassailable but benign dictator perhaps best captured by the absolute power he grants the State, right down to the foundational level of determining what is moral:

I observe the Diseases of a Common-wealth, that proceed from the poison of seditious doctrines; whereof one is, “That every private man is Judge of Good and Evil actions.” This is true in the condition of mere Nature, where there are no Civil Laws; and also under Civil Government, in such cases as are not determined by the Law. But otherwise, it is manifest, that the measure of Good and Evil actions, is the Civil Law… 23

Remember that for Hobbes every action proceeds from a mechanistic cause, and so the very concept of ‘freedom’ struck him as a plain absurdity – and coming from someone who once waged a bitter mathematical dispute with Oxford professor John Wallis after erroneously claiming to be able to square the circle 24, his dismissal of ‘freedom’ is certainly fitting:

[W]ords whereby we conceive nothing but the sound, are those we call Absurd, insignificant, and Non-sense. And therefore if a man should talk to me of a Round Quadrangle; or Accidents Of Bread In Cheese; or Immaterial Substances; or of A Free Subject; A Free Will; or any Free, but free from being hindred by opposition, I should not say he were in an Error; but that his words were without meaning; that is to say, Absurd. 25

According to Hobbes then, freedom reduces to absurdity – a round quadrangle! – which immediately opens the door to totalitarian rule; and no thinker was ever so willing as Hobbes to sacrifice freedom for the sake of security.

But Hobbes is mistaken once again, as one now famous experiment first carried out by psychologist Stanley Milgram – and since repeated many times – amply illustrates. For those unfamiliar with Milgram’s experiment, here is the set up:

Volunteers are invited to what they are told is a scientific trial investigating the effects of punishment on learning. Having been divided into pairs, they are then assigned the roles either of teacher or of learner. At this point, the learner is strapped into a chair and fitted with electrodes before, in an adjacent room, the teacher is given control of apparatus that enables him or her to deliver electric shocks. In advance of this, the teachers are given a low voltage sample shock, just to give them a taste of the punishment they are about to inflict.

The experiment then proceeds with the teacher administering electric shocks of increasing voltage which he or she must incrementally adjust to punish wrong answers. As the scale on the generator approaches 400V, a marker reads “Danger Severe Shock” and beneath the final switches there is simply XXX. Proceeding beyond this level evidently runs the risk of delivering a fatal shock, but in the experiment participants are encouraged to proceed nonetheless.

How, you may reasonably wonder, could such an experiment have been ethically sanctioned? Well, it’s a deception. All of the learners are actors, and their increasingly desperate pleading is just scripted as are their screams. Importantly, however, the true participants (who are all assigned as ‘teachers’) are led to believe the experiment and the shocks are real.

The results – repeatable ones, as I say – are extremely alarming: two-thirds of the subjects go on to deliver what they are told are potentially fatal shocks. In fact, the experiment continues until a teacher has administered three shocks at the 450V level, by which time the actor playing the learner has already stopped screaming and must therefore be presumed either unconscious or dead. “The chief finding of the study and the fact most urgently demanding explanation”, Milgram wrote later, is that:

Ordinary people, simply doing their jobs, and without any particular hostility on their part, can become agents in a terrible destructive process. Moreover, even when the destructive effects of their work become patently clear, and they are asked to carry out actions incompatible with fundamental standards of morality, relatively few people have the resources needed to resist authority. 26

Milgram’s experiment has sometimes been presented as proof of our innate human capacity for cruelty and for doing evil. But this was neither the object of his study nor the conclusion Milgram drew. The evidence instead led him to conclude that the vast majority take no pleasure in inflicting suffering or harm on others, but that they will do so when placed under a certain kind of duress, and especially when an authority figure instructs them to:

Many of the people were in some sense against what they did to the learner, and many protested even while they obeyed. Some were totally convinced of the wrongness of their actions but could not bring themselves to make an open break with authority. They often derived satisfaction from their thoughts and felt that – within themselves, at least – they had been on the side of the angels. They tried to reduce strain by obeying the experimenter but “only slightly,” encouraging the learner, touching the generator switches gingerly. When interviewed, such a subject would stress that he “asserted my humanity” by administering the briefest shock possible. Handling the conflict in this manner was easier than defiance. 27

Milgram thought that it is this observed tendency for compliance amongst ordinary people that enabled the Nazis’ crimes and led to the Holocaust. Milgram’s study also helps to account for why those WWI soldiers, even after sharing food and songs with the enemy, returned ready to fight on in the hours, days, weeks and years that followed the Christmas Armistice. Disobedience was often severely punished, of course, with the ignominy of a court martial before execution by firing squad, but authority alone is generally enough to ensure compliance. Most people will simply follow orders.

In short, what Milgram’s study shows is that Hobbes’ solution is, at best, deeply misguided, because it is authoritarianism (his remedy) that leads ordinary humans to commit some of the greatest atrocities. So Milgram offers us a way of considering Hobbes from a top-down perspective: addressing how obedience to authority shapes human behaviour.

But what about the bottom-up view? After all, this was Hobbes’ favoured approach, since he very firmly believed (albeit incorrectly) that his own philosophy was solidly underpinned by mathematics – his grandest ambition had been to derive a social philosophy that followed logically and directly from the theorems of Euclid. Thus, according to Hobbes’ ‘promissory materialism’, which sees Nature as wholly mechanistic and reduces all action to impulse, animal behaviours – including human ones – are fully accountable to and ultimately determined by, to apply a modern phrase, ‘basic instincts’. Is this true? What does biology have to say on the matter, and more specifically, what are the findings of those who have closely studied animal behaviour?

*

This chapter is concerned with words rather than birds…

So writes British ornithologist David Lack, who devoted much of his life to the study of bird behaviour, conducting four years of fieldwork while teaching at Dartington Hall School in Devon, his spare time spent observing the local robin populations; his findings are delightfully written up in a seminal work straightforwardly titled The Life of the Robin. The passage I am about to quote follows on from the start of chapter fifteen, in which he presents a thoughtful aside under the heading “A digression upon instinct”. It goes on:

A friend asked me how swallows found their way to Africa, to which I answered, ‘Oh, by instinct,’ and he departed satisfied. Yet the most that my statement could mean was that the direction finding of migratory birds is part of the inherited make-up of the species and is not the result of intelligence. It says nothing about the direction-finding process, which remains a mystery. But man, being always uneasy in the presence of the unknown, has to explain it, so when scientists abolish the gods of the earth, of lightning, and of love, they create instead gravity, electricity and instinct. Deification is replaced by reification, which is only a little less dangerous and far less picturesque.

Frustrated by the misunderstandings generated and perpetuated by misuse of the term ‘instinct’, Lack then ventures at length into the variety of ambiguities and mistakes that accompany it, both in casual conversation and in academic contexts; considerations that lead him to a striking conclusion:

The term instinct should be abandoned… Bird behaviour can be described and analysed without reference to instinct, and not only is the word unnecessary, but it is dangerous because it is confusing and misleading. Animal psychology is filled with terms which, like instinct, are meaningless, because so many different meanings have been attached to them, or because they refer to unobservables or because, starting as analogies, they have grown into entities. 28

When I first read Lack’s book I quickly fell under the spell of his lucid and nimble prose and marvelled at his infectious love for his subject. As ordinary as they may seem to us, robins live surprisingly complicated lives, and all of this was richly told, but what stood out most was Lack’s view on instinct: if its pervasive stink throws us off the scent in our attempts to study bird behaviour, then how much more alert must we be to its bearing on perceived truths about human psychology? Lack ends his own brief digression with a germane quote from philosopher Francis Bacon that neatly considers both:

“It is strange how men, like owls, see sharply in the darkness of their own notions, but in the daylight of experience wink and are blinded.” 29

*

The wolves of childhood were creatures of nightmares. One tale told of a big, bad wolf blowing your house down to eat you! Another reported a wolf sneakily dressing up as an elderly relative and climbing into bed. Just close enough to eat you! Still less fortunate was the poor duck in Prokofiev’s enchanting children’s suite Peter and the Wolf, swallowed alive and heard, in a climactic diminuendo, quacking from inside the wolf’s belly. When I’d grown a little older, I also came to hear stories of werewolves that sent still icier dread coursing down my spine…

I could go on and on with similar examples, because wolves are invariably portrayed as rapacious and villainous throughout the folkloric traditions of the civilised world of Eurasia, which is actually quite curious when you stop to think about it. Curious because wolves are not especially threatening to humans and wolf attacks are comparatively rare occurrences – while other large animals, including bears, all of the big cats, sharks, crocodiles, and even large herbivores like elephants and hippos, pose a far greater threat to us. To draw an obvious comparison, polar bears habitually stalk humans, and yet rather than finding them terrifying we are taught to see them as cuddly. Evidently, our attitudes towards the wolf have been shaped by factors other than the observed behaviour of wolves themselves.

So now let us consider the rather extraordinary relationship our species actually has with another large carnivore: man’s best friend and cousin of the wolf, the dog – and incidentally, dogs kill (and likely have always killed) a lot more people than wolves.

The close association between humans and dogs is incredibly ancient. Dogs are very possibly the first animal humans ever domesticated, becoming so ubiquitous that no society on earth exists that hasn’t adopted them. This adoption took place so long ago in prehistory that conceivably it may have played a direct role in the evolutionary development of our species; and since frankly we will never know the answers here, I feel free to speculate a little. So here is my own brief tale about the wolf…

One night a tribe sat around the campfire finishing off the last of their meal as a hungry wolf secretly watched on. A lone wolf, and being a lone wolf, she was barely able to survive. Enduring hardship and eking out a precarious existence, this wolf was also longing for company. Drawn to the smell of the food and the warmth of the fire, she tentatively entered the encampment and for once wasn’t beaten back with sticks or chased away. Instead one of the elders at the gathering tossed her a bone to chew on. The next night the wolf returned, and the next, and the next, until soon she was welcomed permanently as one of the tribe: the wolf at the door finding a new home as the wolf by the hearth.

As a story, it sounds plausible enough that something like it may have happened, perhaps countless times and in many locations. Having enjoyed the company of the wolf, the people of the tribe later adopted her cubs (or perhaps it all began with cubs). In any case, as the wolves became domesticated they changed, and within just a few generations of selective breeding had been fully transformed into dogs.

The rest of the story is more or less obvious too. With dogs, our ancestors enjoyed better protection and could hunt more efficiently. Dogs run faster, and have far greater endurance and keener hearing and smell. Soon they became our fetchers and carriers too; our dogsbodies. Speculating a little further, our symbiotic relationship might also have opened up the possibility for evolutionary development at a physiological level. Like cave creatures that lose pigmentation and in which eyesight atrophies in favour of greater tactile sense or sonar 30, we likewise might have lost acuity in those senses we needed less, as the dogs compensated for our loss, which might in turn have freed our brains for other tasks. Did losses in our faculties of smell and hearing enable more advanced dexterity and language skills? Did we perhaps also lose our own snarls to replace them with smiles?

I shan’t say much more about wolves, except that we know from our close bond with dogs that they are affectionate and loyal creatures. So why did we vilify them as the “big, bad wolf”? My hunch is that they represent symbolically, something we have lost, or perhaps more pertinently, that we have repressed in the process of our own domestication. In a deeper sense, this psychological severance involved our alienation from all of nature. It has caused us to believe, like Hobbes, that all of nature is nothing but rapacious appetite, red in tooth and claw, and that morality must therefore be imposed upon it by something other; that other being human rationality.

Our scientific understanding of wolf behaviour has been radically overturned. Previously accepted beliefs that wolves compete for dominance by becoming alpha males or females turn out to be largely untrue. Or at least this happens only if unrelated wolves are kept in captivity. In all cases where wolves are studied in their natural environment, the so-called ‘alpha’ wolves are just the parents – in other words, wolves form families just like we do:

*

One school views morality as a cultural innovation achieved by our species alone. This school does not see moral tendencies as part and parcel of human nature. Our ancestors, it claims, became moral by choice. The second school, in contrast, views morality as growing out of the social instincts that we share with many other animals. In this view, morality is neither unique to us nor a conscious decision taken at a specific point in time: it is the product of gradual social evolution. The first standpoint assumes that deep down we are not truly moral. It views morality as a cultural overlay, a thin veneer hiding an otherwise selfish and brutish nature. Perfectibility is what we should strive for. Until recently, this was the dominant view within evolutionary biology as well as among science writers popularizing this field. 31

These are the words of Dutch primatologist Frans de Waal, who became one of the world’s leading experts in chimpanzee behaviour. Based on his studies, de Waal applied the term “Machiavellian intelligence” to describe the variety of cunning and deceptive social strategies used by chimps. A few years later, however, de Waal came across bonobos – their and our pygmy cousins – held captive in a zoo in Holland, and says they had an immediate effect on him:

“[T]hey’re totally different. The sense you get looking them in the eyes is that they’re more sensitive, more sensual, not necessarily more intelligent, but there’s a high emotional awareness, so to speak, of each other and also of people who look at them.” 32

Sharing a common ancestor with bonobos and chimps, humans are in fact equally closely related to both species, and interestingly, when de Waal was asked whether he thinks we are more like the bonobo or the chimp, he replied:

“I would say there are people in this world who like hierarchies, they like to keep people in their place, they like law enforcement, and they probably have a lot in common, let’s say, with the chimpanzee. And then you have other people in this world who root for the underdog, they give to the poor, they feel the need to be good, and they maybe have more of this kinder bonobo side to them. Our societies are constructed around the interface between those two, so we need both actually.” 33

De Waal and others who have studied primates are often astonished by their kinship with our own species. When we look deep into the eyes of chimps, gorillas, or even those of our dogs, we find ourselves reflected in every way. It’s not hard to fathom where morality came from, and the ‘veneer theory’ of Hobbes reeks of a certain kind of religiosity, infused with a deep insecurity born of the hardship and terrors of civil strife.

*

New scientific studies are showing that primates, elephants, and other mammals including dogs also display empathy, cooperation, fairness and reciprocity – that morality is an aspect of nature. Here Frans de Waal shares some surprising videos of behavioural tests that show how many of these moral traits all of us share:

*

II       Between two worlds

I was of three minds,

Like a tree

In which there are three blackbirds

— Wallace Stevens 35

*

Of all the creatures on earth, apart from a few curiosities like the kangaroo and giant pangolin, or some species of long-since extinct dinosaurs, only the birds share our bipedality. The adaptive advantage of flight is so self-evident that there’s no need to ponder why the forelimbs of birds morphed into wings, but the case for humans is more curious. Why it was that about four million years ago, a branch of hominids came to stand on two legs rather than four, enabling them to move quite differently from our closest living relatives (bonobos and chimps), with all of the physiological modifications this involved, still remains a mystery. But what is abundantly clear and beyond all speculation is that this single evolutionary change freed up our hands for purposes no longer restricted by their formative locomotive demands, and that having liberated our hands, not only did we become supreme manipulators of tools, but this sparked a parallel growth in intelligence, causing us to become supreme manipulators per se – the very etymological root of the word coming from ‘man-’ meaning ‘hand’ of course.

With our evolution as manual apes, humans also became constructors, and curiously here is another trait that we have in common with many species of birds. That birds are able to build elaborate structures to live in is indeed a remarkable fact, and that they necessarily achieve this by organising and arranging the materials using only their beaks is surely more remarkable still. Storks with their ungainly bills somehow manage to arrange large piles of twigs so carefully that their nests often overhang impossibly small platforms like the tips of telegraph poles. House martins construct wonderfully symmetrical domes just by patiently gluing together globules of mud. Weaver birds, a range of species similar to finches, build the most elaborate nests of all, and quite literally weave their homes from blades of grass. How they acquired this ability remains another mystery, for though recent studies have found that there is a degree of learning involved in the styles and manner of construction, this general ability of birds to construct nests is an innate one. According to that throwaway term, they do it ‘by instinct’. By contrast, in one way or another, all human builders must be trained. As with so much about us, all our constructions are therefore cultural artefacts.

*

With very few exceptions, owls have yellow eyes. Cormorants instead have green eyes. Moorhens and coots have red eyes. The otherwise unspectacular satin bowerbird has violet eyes. Jackdaws sometimes have blue eyes. Blackbirds have extremely dark eyes – darker even than their feathers – jet black pearls set within a slim orange annulus which neatly matches their strikingly orange beaks. While eye colour is common to birds within each species, the case is clearly different amongst humans, where eye colour is one of a multitude of variable physical characteristics including natural hair and skin colour, facial characteristics, and height. Nonetheless, as with birds and other animals where there is significant uniformity, most of these colourings and other identifying features are physical expressions of the individual’s genetic make-up or genotype; an outward expression of genetic inheritance known technically as the phenotype.

Interestingly, for a wide diversity of species, there is an inheritance not only of morphology and physiology but also of behaviour. Some of these behavioural traits may then act in turn to shape the creature’s immediate environment – so the full phenotypic expression is often observed to operate outside and far beyond the body of the creature. These ‘extended phenotypes’, as Dawkins calls them, are discovered within such wondrous but everyday structures as spiders’ webs, delicate tube-like homes formed by caddis fly larvae, the larger scale constructions of beavers’ dams and of course birds’ nests. It is reasonable therefore to speculate on whether the same evolutionary principle applies to our human world.

What, for instance, of our own houses, cars, roads, bridges, dams, fortresses, cathedrals, systems of knowledge, economies, music and other works of art, languages…? Once we have correctly located our species as just another amongst many, existing at a different tip of an otherwise unremarkable branch of our undifferentiated evolutionary tree of life, why wouldn’t we judge our own designs as similarly latent expressions of human genes interacting with their environment? Indeed, Dawkins addresses this point directly and points out that tempting as it may be, such broadening of the concept of phenotype stretches his ideas too far, since, to offer his own example, scientific justification must then be sought for genetic differences between the architects of different styles of buildings! 36

In fact, the distinction here is clear: artefacts of human conception which can be as wildly diverse as Japanese Noh theatre, Neil Armstrong’s footprints on the moon, Dadaist poetry, recipes for Christmas pudding, TV footage of Geoff Hurst scoring a World Cup hat-trick, and as mundane as flush toilets, or rarefied as Einstein’s thought experiments, are all categorically different from such animal artefacts as spiders’ webs and beavers’ dams. They are patterns of culture not nature. Likewise, all human behaviour right down to the most ephemeral including gestures, articulations and tics, is profoundly patterned by culture, not shaped solely by pre-existing and underlying patterns within our human genotypes.

Vocabulary – another human artefact – makes this plain. We all know that eggs are ‘natural’ whereas Easter eggs are distinguishable as ‘artificial’, and that the eye is ‘natural’ while cameras are ‘technological’, both of our antonyms deriving from roots in words for ‘art’. Which means that while ‘nature’ is a strangely slippery noun that in English points to a whole host of interrelated objects and ideas, equivalent words nonetheless exist throughout other languages to distinguish our manufactured worlds – of arts and artifice – from the surrounding physical world comprised solely of animals, plants and landscapes. A reinvention of this same word-concept occurs for a simple yet important reason: the difference it labels is inescapable.

*

As a species, we are incorrigibly anthropomorphising; constantly imbuing the world with our own attributes and mores. Which brings up a related point: what animal besides the human is capable of reimagining things in order to make them conform to any preconceived notion of any kind? Dogs may mistake us for other dogs – although I doubt this – but still we are their partners within surrogate packs, and thus, in a sense, surrogate dogs. But from what I know of dogs, their world is altogether more direct. Put simply it is… stick chasing… crap taking… sleep sleeping… or (best of all) going for a walk, which again is more straightforwardly being present on an outdoor exploration! In short, dogs live so close to the passing moment, because they have nowhere else to live. Yet humans mostly cannot. Instead we drift in and out of our past or in anticipation of our future. Recollections and goals fill our thoughts repeatedly and it is exceedingly difficult to attend fully to the present.

Moreover, for us the world is nothing much without other humans. Without culture, any world worthy of the name is barely conceivable at all, since humans are primarily creatures of culture. Yes, there would still be the wondrous works of nature, but no art beyond, and no music except for the occasional bird-song and the wind in the trees: nothing but nothing beyond the things-in-themselves that surround us, and without other humans, no need to communicate our feelings about any of this. In fact, there could be no means to communicate at all, since no language could ever form in such isolation. Instead, we would float through a wordless existence, which might be blissful or grindingly dull, but either way our sense impressions and emotions would remain unnamed.

So it is extremely hard to imagine any kind of world without words, although such a world quite certainly exists. It exists for animals and it exists in exceptional circumstances for humans too. The abandoned children who have been nurtured by wild animals (very often wolves) provide an uneasy insight into this world beyond words. So too, for different reasons, do a few of the profoundly and congenitally deaf. On very rare occasions, these children have gone on to learn how to communicate, and when this happens, what they tell us is just how important language is.

*

In his book Seeing Voices, neurologist Oliver Sacks describes the awakening of a number of such remarkable individuals. One such was Jean Massieu. Almost without language until the age of fourteen, Massieu had become a pupil at Roch-Ambroise Cucurron Sicard’s pioneering school for the deaf. Astonishingly, he had eventually become eloquent in both sign language and written French.

From Sicard’s own description, Sacks considers Massieu’s steep learning curve, and sees how similar it is to Sacks’s own experience with a deaf child. By attaching names to objects in pictures that Massieu had drawn, Sicard managed to open the young man’s eyes. Labels that, to begin with, left his pupil “utterly mystified”, but then suddenly Massieu had “got it”. And Sacks describes how he had understood not merely the abstract connection between the pencil lines of his own drawing and the seemingly incongruous additional strokes of his tutor’s labels, but, just as immediately, he had recognised the value of such a tool: “… from that moment on, the drawing was banished, we replaced it with writing.”

The most magical part of Sacks’s retelling comes in the description of Massieu and Sicard’s walks together through the woods. “He didn’t have enough tablets and pencils for all the names with which I filled his dictionary, and his soul seemed to expand and grow with these innumerable denominations…” Sicard later wrote.

Massieu’s epiphany brings to mind the story of Adam with his naming of all the animals in Eden, and Sacks tells us:

“With the acquisition of names, of words for everything, Sicard felt, there was a radical change in Massieu’s relation to the world – he had become like Adam: ‘This newcomer to earth was a stranger on his own estates, which were restored to him as he learned their names.’” 37

It is this gift for language that most obviously sets us apart from other creatures. Not that humans invented language from scratch, of course, since it grew up both with us and within us: one part phenotype and one part culture. It evolved within other species too, but for reasons unclear, we excelled, and as a consequence became adapted to live in two worlds, or as Aldous Huxley preferred to put it: we have become “amphibian”, in that we simultaneously occupy “the given and the home-made, the world of matter, life and consciousness and the world of symbols.” 38

Using words helps us to relate the present to the past. We reconstruct it or perhaps reinvent it. Likewise with words we envisage a future. This moves us outside Time. It makes us feel at home, helps us to heal past wounds and to prepare for future events. Correspondingly, it also detaches us from the present.

For whereas many living organisms exist entirely within their immediate physical reality, human beings occupy a parallel ideational space where we are almost wholly embedded in language. Now think about that for a moment… no really do!

Stop reading this.

Completely ignore this page of letters, and silence your mind.

Okay, close your eyes and turn your attention to absolutely anything you like and then continue reading…

So here’s my question: when you were engaged in your thoughts, whatever you thought about, did you use words at all? Very likely you literally “heard” them: your inner voice filling the silence in its busy, if generally unobtrusive and familiar way. Pause again and now contemplate the everyday noise of being oneself.

Notice how exceedingly difficult it is to exist if only for a moment without any recourse to language.

Perhaps what Descartes should have said is: I am therefore I think!

For as the ‘monkey mind’ goes wandering off, instantly the words have crept back into our mind, and with our words comes this detachment from the present. Every spiritual teacher knows this, of course, recognising that we cannot be wholly present to the here and now while our mind darts off to visit memories, wishes, opinions, descriptions, concepts and plans: the same memories, wishes, opinions, descriptions, concepts and plans that gave us an evolutionary advantage over our fellow creatures. Such a teacher also understands that the real art of meditation cannot involve any direct attempt to silence our excitable thoughts, but only to ignore them.

It is evident therefore how in this essential way we are indeed oddly akin to amphibious beings since we occupy and move between two distinct habitats. Put differently, our sensuous, tangible outside world of thinginess (philosophers sometimes call this ‘sense data’) is totally immersed within the inner realms of language and symbolism. So when we observe a blob with eight thin appendages we probably see something spider-like. If we hate spiders then we are very likely to recoil from it. If we have a stronger aversion then we will recoil even after we are completely sure that it’s just a picture of a spider or, in extreme cases, a tomato stalk. On such occasions, our feelings of fear or disgust arise not as the result of failing to distinguish the likeness of a spider from a real spider, but from the power of our own imagination: we literally jump at the thought of a spider.

Moreover, words are sticky. They collect together in streams of association and these mould our future ideas. Religion = goodness. Religion = stupidity. If we hold the first opinion then crosses and pictures of saints will automatically generate a different affect than if we hold the latter. Or how about replacing the word ‘religion’ with say ‘patriotism’: obviously our perception of the world alters in a different way. In fact, just as the pheromones in the animal kingdom cause the direct transmission of behavioural effects to members of a species, the language secreted by humans is likewise capable of directly impacting the behaviour of others.

It has become our modern tendency automatically to suppose that the arrow which connects these strikingly different domains points unerringly in one direction: that language primarily describes the world, whereas the world as such is relatively unmoved by our descriptions of it. This is basically the presumed scientific arrangement. By contrast, any kind of magical reinterpretation of reality involves a deliberate reversal of the direction of the arrow such that all symbols and language are treated as potent agents that might actively cause change within the material realm. Scientific opinion holds that this is false, and yet, on a deeply personal level, language and symbolism not only comprise the living world, but do quite literally shape and transform it. As Aldous Huxley writes:

“Without language we should merely be hairless chimpanzees. Indeed, we should be something much worse. Possessed of a high IQ but no language, we should be like the Yahoos of Gulliver’s Travels—creatures too clever to be guided by instinct, too self-centred to live in a state of animal grace, and therefore condemned to remain forever, frustrated and malignant, between contented apehood and aspiring humanity. It was language that made possible the accumulation of knowledge and the broadcasting of information. It was language that permitted the expression of religious insight, the formulation of ethical ideals, the codification of laws. It was language, in a word, that turned us into human beings and gave birth to civilization.” 39

*

As I look outside my window I see a blackbird sitting on the TV aerial of a neighbouring rooftop. This is what I see, but what does the blackbird see? Obviously I cannot know for certain, though merely in terms of what he senses, we know that his world is remarkably different from ours. For one thing, birds have four types of cone cells in the retinas of their eyes while we have only three. Our cone cells collect photons centred on red, green and blue frequencies and different combinations generate a range of colours that can be graphically mapped as a continuously varying two-dimensional plane of colours; however, if we add another colour receptor then the same mapping requires an additional axis that extends above the plane. For this reason we might justifiably say that the bird sees colours in ways that differ not merely by virtue of the extent of the detectable range of frequencies, but that a bird’s vision involves a range of colour combinations of a literally higher dimension.

Beyond these immediate differences in sense data, there is another way in which a bird’s perceptions – or more strictly speaking its apperceptions – are utterly different from our own, for though the blackbird evidently sees the aerial, it does not recognise it as such. Presumably it sees nothing beyond a convenient metal branch to perch upon decked with unusually regular twigs. This is not to disparage the blackbird for its lesser intelligence, but to respect the fact that even the most intelligent of all blackbirds is incapable of knowing more, since this is all any bird can ever understand about the aerial.

No species besides our own is capable of discovering why the aerial was actually put there, or how it is connected to an elaborate apparatus that turns the invisible signals it captures into pictures and patterns of sounds, let alone gathering the knowledge of how metal can be manufactured by smelting rocks or the still more abstruse science of electromagnetism.

My point here is not to disparage the blackbird’s inferior intellect, since it very possibly understands things that we cannot; but to stress how we are unknowingly constrained in ways we very likely share with the bird. As Hamlet cheeks his friend: “There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy.”

Some of these things – and especially the non-things! – may slip us by forever as unknown unknowns purely by virtue of their inherently undetectable nature. Others may be right under our nose and yet, just like the oblivious bird perched on its metal branch who can never consider reasons for why it is there, we too may lack any capacity even to understand that there is any puzzle at all.

*

I opened the chapter with a familiar Darwinian account of human beings as apex predators struggling for survival on an ecological battlefield; perpetually fighting over scraps, and otherwise competing over a meagre share of strictly limited resources. It is a vision of reality solidly founded upon an overarching belief in scientific materialism, and although a rather depressing vision, it is certainly the prevailing orthodoxy – the Weltanschauung of our times – albeit seldom expressed so antiseptically as it might be.

Indeed, to boil this down further, as doctrinaire materialist hardliners really ought to insist, we might best comprehend ourselves as biological robots. Why robots? Because according to the doctrine we are genetically coded not for experiences, or even merely for survival, but solely for reproductive success – and evolved to function for just such time as to fulfil this primary objective. Our death is then as inconsequential as it is inevitable.

Indeed, propagation of every species goes on blindly until such time as the species as a whole inevitably becomes extinct. If this process is extended by technological means beyond even the death of the earth and solar system, then it will end when the entire universe succumbs to its own overarching and insignificant end. No amount of space colonisation will save us here.

More nakedly told, it is not merely that, as Nietzsche famously lamented, “God is dead”, which has some upsides, but, that while richly animated, there is nothing going on whatsoever besides machine process, anywhere in this universe or the next. This reduction of the cosmos to machine process is Hobbes’ vision in a nutshell too.

In common with the old religions, the boundaries of this new mechanistic belief system extend boundless and absolute, and thereby encompass whatever remnants of any god or gods we might try to salvage. There remains no location for any god within, nor even the apparatus to exercise free will. Virtue, compassion and love are all epiphenomenal illusions. Redemption comes in the form only of a compensatory genetic subroutine compelling us to carry on regardless of the painful irrelevance of our human situation.

Unsurprisingly, we seldom reflect on the deep existential ramifications of our given materialist mythos, which is, for the most part, unconsciously inculcated; and almost no-one lives a life in strict nihilistic accord. Instead, we mostly bump along trying to be good people (a religious hangover presumably), with an outlook that approximates the one most succinctly expressed by Morty Smith: “Nobody exists on purpose, nobody belongs anywhere, everybody’s gonna die. Come watch TV.” 39a

*

III      Blinded by history

“All history is nothing but a continuous transformation of human nature”

— Karl Marx 40

*

History, someone once joked, is just one damn thing after another! A neat one-liner, disassembling history, as it does, into its component and frequently terrible events, which then follow in sequence with little more intent than the random footsteps of the drunkard. Progress may be admitted in both cases, of course, for in spite of deficiencies in our sense of direction we generally make it home.

However, to view history in such a wholly disjointed way is also to desiccate it, although such a vulgar reductio ad absurdum is also the reason, of course, the joke is amusing. For why bother studying history at all when it makes so little sense? History, thus reduced, is surely bunk, and yet history at school has traditionally been taught very much like this: as just one thing after another…

Real historians make their living by joining up the dots instead, and attempting to put flesh back on the bones by reconstructing the past much like palaeontologists reconstruct dinosaurs. But here again there are dangers. After all, when you’ve only got bones you’ve got to add the muscle and skin to your tyrannosaurus rex, and these have to be included on the basis of what you know about living, or at least, less extinct creatures. So when I was still a child, I learnt about an enormously long, herbivorous monster called the brontosaurus, whereas, as it now transpires, no such creature ever walked the Earth… at least not quite such a creature. Its discoverer, Othniel Charles Marsh, in his rush to establish a new species, had accidentally got the bones jumbled up. Worse than this, Marsh, having excavated an almost complete skeleton, albeit one lacking a skull, had creatively added a composite head constructed from finds at different locations. As it turned out, the brontosaurus that he thought he’d discovered was just an adult specimen of an already classified genus, the apatosaurus.

What applies to reconstructions in palaeontology also applies, at least in general terms, to reconstructions of human history: the difference being that whereas palaeontologists rely on fossil records, historians piece together surviving records of a different kind: books, documents, diaries, and during more recent times, photographs and audio-visual recordings. When detailing and interpreting events beyond living memory (which is rather short) the historian then has to rely solely on documentary sources, since there is literally nothing else. This magnifies the difficulty faced by the historian, since, unlike bones and rocks, human records can frequently distort the truth (either wilfully or by accidents of memory).

How, then, does a scrupulous historian know which records to trust, especially if he encounters records that are in direct contradiction? How to ascribe greater reliability to some records over others? Or to determine whether any newly unearthed record is reliable, unreliable, authentic or just a hoax? Well, here s/he must become a detective, and just as a police detective relies upon cross-examination to check facts and corroborate evidence from witnesses, so the diligent historian makes thorough cross-checks against his alternative sources. There is, of course, an ineluctable circularity to all this.

In 1983, when the Hitler Diaries turned up out of the blue, they were quickly authenticated by three different expert historians: Hugh Trevor-Roper, Eberhard Jäckel and Gerhard Weinberg. The diaries were shortly afterwards exposed as forgeries and totally discredited by means of forensic analysis: the paper and ink turned out to be the biggest give-away. And Hitler had been dead less than four decades, well within living memory, so there were also ample handwritten documents to compare his words against. Such unassailable forensic evidence is obviously the exception rather than the rule for the greatest tracts of history.

Historians have their work cut out, since getting the basic facts straight is just the start of the process. If History is to be a living subject then they must try not to leave out too much of the warm, moist uncertainty of the real lives that made it, even though the greater part of most past lives must inevitably be lost in history’s creases, whilst any History told as just one damn thing after another is History shrivelled up to the driest of husks. Indeed, as archaeologist and historian John Romer once elegantly put it: “History is only myth: stories trying to make sense of reality.” 41

*

Two decades ago, I embarked on an adventure to the USA. I was travelling with Neil, a friend and post-graduate colleague, to the International Conference on Asteroids, Comets and Meteors in Flagstaff, Arizona. We were wined and dined and given tours of the Grand Canyon and Meteor Crater. It was to be a most splendid jolly!

After the conference, we also took a tour a little further into the great continent. We hired a car and headed west on Route 66, only reaching our final destination, San Francisco, after a solid week of driving. Along the way, we had stopped to admire the great Hoover Dam, Las Vegas, Death Valley, Los Angeles, the giant redwoods and the towering rocks of Monument Valley which form such a spectacular backdrop to so many Westerns. En route we had also encountered the occasional roadside stalls where the Native Americans who sold trinkets to get by would try to entice passing trade with off-road signs and promises of dinosaur footprints.

On one of our excursions we visited that most famous of petrified forests, with its fossilised trees strewn like ancient bronze-casts, and then nearby, we wandered the ruined remains of human settlements. The ruins had signs too, ones that told us the houses were believed to have been built about six hundred years ago or so, or, as the notes put it: “prehistoric”. Being Europeans we laughed, but of course we shouldn’t have. The idea that a mere six-hundred-year-old ruin could be designated “prehistoric” was not another fine example of dumb-ass American thinking, but a straightforward fact: history, as I said above, being a discipline that arises from documentation. Automatically, therefore, we, meaning all modern people, have, to put matters mildly, an historical bias.

Let’s be clear: Christopher Columbus did not discover America. For one thing, there were millions of people already living there. But Columbus wasn’t even the first European to sail to the ‘New World’. That honour more likely goes to Erik Thorvaldsson, better known as Erik the Red, the Viking explorer credited in the Icelandic sagas with founding the first settlement in Greenland. Nor was Columbus the first European ever to set foot on continental American soil. The plaudits here should go instead to Thorvaldsson’s son, Leif Erikson, who according to the sagas established a Norse settlement in Vinland, generally identified with modern Newfoundland. This all took place a full five centuries before the voyage of Genoese pretender Columbus.

What then did Columbus bring to our story, if not discovery? Well, the answer can be read in the lines of his captain’s log. This is what he writes about his first encounter with the Arawak Indians who inhabited the archipelago known today as the Bahamas:

They go as naked as when their mothers bore them, and so do the women, although I did not see more than one young girl. All I saw were youths, none more than thirty years of age. They are very well made, with very handsome bodies, and very good countenances… They neither carry nor know anything of arms, for I showed them swords, and they took them by the blade and cut themselves through ignorance… They should be good servants and intelligent, for I observed that they quickly took in what was said to them, and I believe they would easily be made Christians, as it appeared to me that they had no religion.

On the next day, Columbus then writes:

I was attentive, and took trouble to ascertain if there was gold. I saw that some of them had a small piece fastened in a hole they have in the nose, and by signs I was able to make out that to the south, or going from an island to the south, there was a king who had great cups full, and who possessed a great quantity.

The following day, a Sunday, Columbus decided to explore the other side of the island, and once again was welcomed by the villagers. He writes:

I saw a piece of land which appeared like an island, although it is not one, and on it there were six houses. It might be converted into an island in two days, though I do not see that it would be necessary, for these people are very simple as regards the use of arms, as your Highnesses will see from the seven that I caused to be taken, to bring home and learn our language and return; unless your Highnesses should order them all to be brought to Castile, or to be kept as captives on the same island; for with fifty men they can all be subjugated and made to do what is required of them. 42

Having failed in his quest for gold, Columbus sought out a different cargo for his subsequent expeditions to bring back to Spain. In 1495, his men corralled 1,500 Arawak men, women and children in pens and selected the fittest five hundred specimens for transportation. Two hundred died onboard the ships and the survivors were all sold into slavery. Unfortunately for Columbus, however, and by turns for the native people of the Caribbean, this trade in humans was insufficiently profitable to pay back his investors, and so Columbus adopted a different strategy and intensified his search for gold once more.

In Haiti, where he believed the precious metal lay in greatest abundance, Columbus soon demanded that everyone over the age of fourteen must find and exchange a quarterly tribute of gold for a copper token. Failure to comply was severely punished with the amputation of limbs, the victims left to bleed to death, while those who tried out of desperation to escape were hunted down with dogs and then summarily executed.

Bartolome de las Casas, a young priest who had arrived to participate in the conquest and was indeed for a time a plantation owner, afterwards became an outspoken critic and reported on the many atrocities he witnessed. 43 In his own three-volume chronicle, History of the Indies, las Casas later wrote:

The Indians were totally deprived of their freedom and were put into the harshest, fiercest, most horrible servitude and captivity which no one who has not seen it can understand. Even beasts enjoy more freedom when they are allowed to graze in the field. 44

*

Napoleon is credited with the utterance that “History is written by the winners”, or alternatively, “What is history but a fable agreed upon?” 45, and for one with such a prodigious record both of winning and of “making history”, who doubts that he knew whereof he spoke. Strange, therefore, how little attention is paid to Napoleon’s straight-talking maxim; how instead we eagerly absorb the authorised versions of our histories, trusting that by virtue of scholastic diligence and impartiality, these reconstructions of the past represent a near facsimile of the actual events. But then, with regard to the centuries-long fractious infighting between the European monarchies, we are privy to the accounts of both adversaries. So here we generally have – at the very least – two sides to every tale of every conflict, scandal and criminal act. In stark contrast, of course, when the British and the other European powers first sailed to those unconquered lands soon afterwards to be collectively known as “the colonies”, only one side of the story remains extant.

For during the last five hundred years or so, the times when western records have been most replete, a world once teeming with a diversity of alternative cultures has been slowly wiped away: the people of these forgotten worlds either annihilated or wholly assimilated by the great European powers. Our rather homogeneous culture, sometimes by the terror of cannons and on other occasions by the softer coercions of the sermons of missionaries, has thereby steadily erased and replaced the heterogeneous confusion, sometimes as swiftly as it was encountered. Defeated cultures, if not entire indigenous populations, were not just swept aside, but utterly and irreversibly deleted.

Oral traditions leave little if anything in the way of an historical trace, and so back in the fifteenth century, America was indeed “prehistoric”; its history being established only when the alien invaders (as the first Europeans must have appeared to the wide eyes of the native peoples they were about to overwhelm – as creatures from another world) first stepped ashore. As in the Americas, so too in Australia and the other so-called “new worlds”, where, of the novelties we brought, perhaps the most significant was History itself.

When relying upon evidence from History, it is important therefore, to continually bear in mind that throughout most regions of the world, throughout almost all of human time, people didn’t actually have any. That all of History begins only with writing, which was a largely Eurasian preoccupation. Thus History in most parts of the world only began with our arrival: its origins, an indirect consequence of conquest, oppression, exploitation and enslavement.

Pulitzer-prize winning journalist, author and activist Chris Hedges discusses the teaching of history as a form of indoctrination with Professor James W. Loewen, author of ‘Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong’:

*

I could at this juncture attempt to list all the barbarisms of history, although to do justice I would need to at least double the length of the current chapter. Just a few examples will more than serve the purpose of illustrating the point…

From the North came the longboats of the Vikings intent on rape and pillage; from the East, the marauding Mongol horde, and the butchery of tyrants such as Vlad the Impaler; in the Mediterranean South, we were once entertained by the sadistic spectaculars of the Roman circuses, and then afterwards the more ideologically entrenched atrocities of the Spanish Inquisition. When the first Europeans explored the lands of the West, the ruthless conquistadors came face to face with the blood-curdling savagery of the Aztec and Mayan empires. Which was the more dreadful?

In former times, the Christians marched thousands of miles to slaughter innocents in the name of the Prince of Peace, and, in astonishingly recent times, other Christians dispatched heathens and heretics by drowning, burning and lynching, especially at the height of the witch craze that swept Europe and America well into the Enlightenment period.

Muslims, by comparison, have generally preferred to kill in the name of Jihad and Fatwa, or else to inflict judicial cruelties by means of stoning, flagellation, amputation and decapitation, all in strict accordance with their holy Sharia Law. But then the irreligious are no less diabolical, whether we consider Hitler and the Nazi death camps, or the Soviet gulags, or the killing fields of Cambodia, and Mao Tse-tung’s “Cultural Revolution” in China. Given how little time has passed since the decline of religion, the sheer number of victims tortured and murdered by these surrogate atheistic (or perhaps neo-pagan in the case of the Nazis) regimes is as gut-wrenching as it is perplexing.

Few have spoken with more forceful eloquence or erudition on the evils of religion than ardent atheist Christopher Hitchens. Sadly it was this same hatred of religion that in the end led Hitchens to join the chorus calling for the neo-imperialist ‘war on terror’ and finally, in a 2003 collection of essays entitled A Long Short War: The Postponed Liberation of Iraq, to argue the case for the ‘shock and awe’ bombing and subsequent invasion of Iraq, which came at the cost of more than a million innocent lives. One of Hitchens’ prime examples of religious authority making good people behave in morally repugnant ways is the barbarous practice of infant genital mutilation:

Britain itself witnessed centuries of religious intolerance, brutal repression and outright thuggery. Henry VIII, one of the most celebrated monsters in history, is chiefly remembered for his penchant for uxoricide, not to mention the land-grabbing and bloodletting of the English Reformation that followed from the convenience of his divorce from Catherine of Aragon. And like father, like daughter: this radical transformation of the sectarian landscape under Henry being partially undone by Bloody Mary’s reign of terror and her ultimately failed restoration of Catholicism (and had she been successful it is doubtful she would be remembered as “Bloody Mary”).

Meantime, the sudden rise and spread of the British and other European empires meant that such commonplace domestic atrocities could, during the next four hundred years, be committed as far afield as Africa, North and South America, India, China, and Australia. All of this facilitated by, and, in turn facilitating and encouraging, the international trade in human slaves. Of course, the European place in world history has been a repeatedly shameful one, but then man’s inhumanity to man has also been legitimised and justified for a hundred other reasons beneath dozens of alternative flags. According to historical records then, human nature is infernally bad, and incurably so.

Cruel, bellicose, sneaky, and selfish; we must plead guilty on all counts. We are perhaps the worst, though differing only by degree from our fellow creatures. And here is something genuinely unique: many of us feel disgraced by our own diabolical behaviour. Somehow, we know that there is a better way to use our special talents. But then, what other creature could take such a detached position? Could actually aspire to be kinder, peaceful, and more selfless?

*

The French writer Voltaire is nowadays best remembered for his marvellous satire, Candide (1759), which he subtitled with characteristic irony: “or the Optimist”. A savage critique of the unenlightened politics and obscurantist metaphysics of his time, Candide is an historical fantasy, with many episodes in the book cleverly interwoven with factual events of the period. It is rightly celebrated, and I reference its central theme in the addendum below. A decade earlier, however, Voltaire had road-tested similar ideas, choosing not an historical backdrop, but one that we would today describe as science fiction. A forgotten classic, Voltaire’s Micromegas (1750) is a story about the adventures of two philosophical aliens. Here is a brief synopsis.

Micromegas, the eponymous hero, is a gigantic inhabitant of the star Sirius, who ventures to Earth, stopping off at Saturn along the way. Being many miles tall, he finds that even the Saturnians, who are themselves as tall as small hills, appear to him as pigmies, and so his initial response is to deride them: “accustomed as he was at the sight of novelties, he could not for his life repress that supercilious and conceited smile which often escapes the wisest philosopher, when he [first] perceived the smallness of that globe, and the diminutive size of the inhabitants”. Eventually, however, and once the Saturnians ceased to be amazed by his gigantic presence, he befriends the secretary of the Academy of Saturn. Having discussed the comparative differences between their two worlds, Micromegas and the Saturnian resolve to set off on a grand tour of the Solar System. Shortly afterwards they arrive on Earth.

Upon landing, they decide to search around for evidence of intelligence but discover no signs of life at all except, eventually, for a whale, which the Saturnian catches between his fingers and shows to Micromegas, “who laughed heartily at the excessive smallness peculiar to the inhabitants of our globe”. As luck would have it, however, a ship of philosophers happens to be returning from a polar expedition, and aboard this ship the aliens soon encounter “a creature very different from the whale”.

Having established contact with the “intelligent atoms” aboard the ship, the alien philosophers are curious to learn about a life so “unencumbered with matter, and, to all appearance, little else than soul” conjecturing that such tiny earthlings must spend their lives “in the delights of love and reflection, which are the true enjoyments of the perfect spirit”. Of course, they are very quickly disabused of such idealist illusions by those on-board:

“We have matter enough,” said [one of the philosophers], “to do abundance of mischief, if mischief comes of matter; and too much understanding, if evil flows from understanding. You must know, for example, that at this very moment, while I am speaking, there are one hundred thousand animals of our own species, covered in hats, slaying an equal number of fellow-creatures who wear turbans; or else are slain by them; and this hath been nearly the case all over the earth from time immemorial…”

“The dispute is about a mud-heap, no bigger than your heel,” continued the philosopher. “It is not that any one of those millions who cut one another’s throats pretends to have the least claim to that clod; the question is to know, whether it shall belong to a certain person who is known by the name of Sultan, or to another whom (for what reason I know not) they dignify with the appellation Caesar. Neither the one nor the other has ever seen, or ever will see, the pitiful corner in question; and scarcely one of those wretches who slay one another hath ever beheld the animal on whose account they are mutually slain!”

Sadly, little has changed since Voltaire wrote his story more than two hundred and fifty years ago. 46

*

But now a related question: why did Europe become such a dominant force in the first place? This, arguably, is the greatest, most important question in all of our History, though one that until contemporary times was met with the most hubristic of lame answers:

The white race is the most versatile, has the most initiative, a greater facility for organization, and a more practical outlook in life. This has led to its mastery of the material side of living, urged it to invention and discovery, and to the development of industry, commerce and science.

So begins an explication outlined under the horrifically racist heading “why is the white race dominant?”, as quoted from a pre-war children’s ‘book of facts’ entitled How Much do You Know?, a copy of which I happen to own. The author’s deep-seated yet unconscious white supremacist mindset presumes such an excruciating air of colonial haughtiness that immediately afterwards the book summarises the other “races” as follows:

The black race, enervated by the heat of the tropics, has never shown great capacity for sustained or combined effort. The brown race, also found in hot climates, has produced the world’s main religions, and is excelled in artistic handicrafts. The yellow race is said still to have a slave mentality: the individual matters nothing, the community all. 47

When I showed this passage to my father he was rightly outraged. Those opinions were outdated and unacceptable when I was at school, he told me. But then my father went to school a full decade after the book’s publication. A world war had since been and gone. Perceptions and attitudes had evidently changed – greatly for the better.

And yet, if we hold our nose to the overwhelming stench of casual racism, there is within the same passage one idea that might – if expressed more sensitively – resonate with a more permissible opinion that is still commonly held today:

It [the white race – Europeans] has had the advantage also of living for the most part in temperate climates, where the struggle for existence has been neither too difficult nor too easy.

In a sense, it was this very assumption that Jared Diamond attempted not so much to dispel as to correct in his best-selling book, Guns, Germs and Steel. In pursuit of that end, he dedicated thirty years of his life on the road, trying to understand precisely why Europe did come to dominate the world, and he makes the intriguing and largely convincing case that the roots of present global inequality were basically an outcome of freak circumstances and happenstance. Not simply “the advantage also of living for the most part in temperate climates”, although, according to Diamond at least, climate has had a vital part to play in the ascent of the West, but also other advantages conferred by location and historical timing.

His book begins by reminding us how the very origins of human civilisation in the Fertile Crescent of the Middle East depended upon the accidental occurrence of arable crops and animals suitable for domestication. These two factors opened the way to a land of plenty. Given that the rise of agriculture was inevitable, Diamond says, its origins also happened to occupy a central geographical location in the Eurasian landmass, a super-continent with the fortuitous orientation of spreading out east and west, thus providing similar lengths of day, of seasons and of climates; it was therefore comparatively easy for these new modes of agriculture to propagate as the people slowly migrated. A led to B led to C, if only because the rise of A, B and C was so perfectly compatible.

Thanks to the development of agriculture, the population enjoyed a surplus, and this in turn brought about the rise of trade, and no less importantly, of free-time. So the people in the new settlements would spend extended periods preoccupied with otherwise unproductive activities, such as making stylistic improvements to their houses and other amenities, rather than, as in former times, gathering nuts or trapping pigs. This new freedom resulted in the rise of new technologies which, with time to spare, could also then be refined – undoubtedly the most significant of which was the production of metals and the development of metal-working skills: ploughshares that were later beaten into swords.

Trade routes led to the transmission of new ideas, and once the discovery of gunpowder in China reached the shores of the Middle East, its military use was quickly perfected. It was thanks to the early invention of writing – which arose on very few occasions worldwide, and just once outside the super-continent of Eurasia, with the development of Mayan script in Mexico – that this steady transmission of ideas and innovations thereafter accelerated.

As a consequence, the Eurasian civilisations had everything in place to begin their takeover, and also a secret weapon in reserve which they weren’t even aware of – germs. Our 10,000 years of domestication of so many species had inadvertently equipped these Eurasian invaders with an arsenal of new biological agents: diseases they themselves had considerable immunity to: smallpox from cattle, chicken-pox and influenza from poultry, to name but three examples. Whereas in North and South America, many people did not live in such close proximity to domesticated animals, and so had neither immunity nor exotic infections of their own to spread. Conquests by war were thus very often followed by pandemics more devastating than even our swords and cannons – although more recently, once the genocidal effect of disease had been better understood, the contamination of Native Americans became chillingly deliberate. The rest is history… our history.

Following on the vanguard of conquerors and explorers, a variety of enterprising European settlers made land grabs for King and Country, and as the empires grew, so a few European superpowers came to dominance. According to Diamond’s version then, it was by virtue of the happenstance of circumstance, the stars very firmly in our favour, that these new kingdoms of the West were first won and then overrun.

The rise of agriculture was a fluke, and the inventions of the printing press and the gun lucky but likely consequences: Diamond presents us with a timeline of evidence to show how European dominance had nothing to do with superior intelligence, or even, that less racist presupposition, superior ideology. We would have won with or without the Protestant work-ethic, and with or without the self-righteous and assertive arrogance that often comes with worship of a One True God; a god who permits unlimited belligerence for holy ends.

In reaching this conclusion, however, Diamond is surely being too much the professor of geography, the scientist, and the archaeologist, and not sufficiently the historian, because even his own evidence doesn’t entirely lend support to such an overarching claim. For when it came to Europe’s seizure of Africa, the tables were to some extent turned, the European settlers now highly susceptible to the ravages of tropical disease, and our advantages, including, of course, the superiority of our weaponry, more than ever buttressed by an unshakeable ideology: that pseudo-religio-scientific notion of racial superiority so imprinted on the minds of the colonisers. It is the European mindset that finally retilts the balance. For the natives needed “civilising”, and despite the ever-present dangers of famine and disease, more than enough Europeans were driven by the profit motive and a deep-seated belief in the virtue of “carrying the white man’s burden”.

*

Bruce Parry is an indigenous rights advocate, author, explorer and filmmaker. He has lived with some of the most isolated tribes in the world, learning from how they interact with each other and the planet. After much exploration, one of the things that has truly inspired Bruce is the idea of egalitarian living. In August 2019, Ross Ashcroft, host of RT’s ‘Renegade Inc.’, caught up with him to hear his ideas on how we can rethink our leadership structures and muster the courage to look within, so we are able to change the modern western narrative:

*

All of the stories we tell fall within two broad categories. First there are our quotidian tales of the everyday. What happened when and to whom. Loosely we might say that all of these are our ‘histories’ whether biographical, personal, anecdotal, or traditional histories that define nations, and where it may be noted the words ‘story’ and ‘history’ are synonymous in many languages. 48 But there are also stories of a second, more fundamental kind: those of fairytale, myth and allegory that sometimes arise as if spontaneously, and though deviating from the strict if mundane ‘truth of accountants’, are able to penetrate and bring to light otherwise occluded insights and wisdom.

Stories of the second kind have sprung forth in all cultures, often sharing common themes and characters. These include stories of creation; of apocalypse; of the wantonness of gods; of murder and revenge; of cosmic love and of battles between superheroes. Interestingly, the songlines of Australian aboriginals map their own stories of origin directly to the land. Less fantastical and wondrous, in the civilised world too, there are nationalistic versions of what might also be more loosely considered ‘songlines’. In England, for instance, we might trace the nation’s genealogy via Stonehenge, Runnymede, Sherwood Forest, Hastings, Agincourt, the white cliffs of Dover and Avalon (today called Glastonbury). Accordingly, Stonehenge tells us we are an ancient people; Runnymede that we are not slaves; Sherwood Forest that we are rebellious and cheer for the underdog; Hastings, Agincourt and the white cliffs of Dover that we are a warrior nation seldom defeated, in part because our isle is all but impregnable; while Avalon, to steal from Shakespeare, makes ours a “blessed plot”:

This royal throne of kings, this sceptred isle,
This earth of Majesty, this seat of Mars,
This other Eden, demi-paradise;
This fortress built by Nature for herself,
Against infection and the hand of war,
This happy breed of men, this little world,
This precious stone set in the silver sea,
Which serves it in the office of a wall,
Or as a moat defensive to a house,
Against the envy of less happier lands;
This blessed plot, this earth, this realm, this England… 49

So here we find history and myth entwined as unavoidably as if they were stories of a single kind. But then what is the past when it is not fully-fleshed and retold in stories? Unlike the rest of the extinct world, it cannot be preserved in jars of formaldehyde and afterwards pinned out on a dissecting table. To paraphrase George Orwell, the stories of our past are not just informed by the present, they are in part reconstituted from it, and thereafter those same stories ineluctably propel us into the future. Not that there is some future already fixed and inescapable, since we have no reason to presume it is, but that what unfolds is already prefigured in our stories, which then guide it like strange attractors, just as today’s world was prefigured by stories told yesterday. If things were otherwise, history would indeed be bunk – nothing more or less than a quaint curiosity. Instead it is an active creator, and all the more dangerous for that. 50

In 1971, Monty Python appeared in an hour-long May Day special showcasing the best of European TV variety. Python’s contribution was a six-minute piece describing traditional May Day celebrations in England, including the magnificent Lowestoft fish-slapping dance [at 2:30 mins]. It also featured as part of BBC2’s “Python Night” broadcast in 1999:

*

IV      Mostly Harmless

“Human nature is not of itself vicious”

— Thomas Paine 51

*

In the eyes of many today, since our evil acts far exceed our good deeds, and indisputably so given the innumerable massacres, pogroms, genocides and other atrocities that make up so much of our collective history, the verdict on ‘human nature’ is clear and unequivocal. With the evidence piled so precipitously against us as a species, we ought to plead guilty in the hope of leniency. However, even though at first glance the case does indeed appear an open-and-shut one, this is not a full account of human nature. There is also the better half to being human, although our virtues are undoubtedly harder to appraise than our faults.

Firstly, we must deal with what might be called ‘the calculus of goodness’. I’ve already hinted at this but let me now be more explicit: whenever a person is kind and considerate, the problem with ‘the calculus’ is how those acts of kindness are to be counted against prior acts of indifference or malevolence. Or to broaden this: how is any number of saints to make up for the actions of so many devils? Can the accumulation of lesser acts of everyday kindness ever fully compensate for a single instance of rape, torture or cold-blooded murder? Or, to raise the same issue on the larger stage again, how could the smallpox and polio vaccines, which undoubtedly saved a great deal of suffering and the lives of millions, compensate for the bombings of Guernica, Coventry, Dresden, Hiroshima and Nagasaki? For aside from the moral dubiousness of all such utilitarian calculations, the reality is that inflicting harm and causing misery is on the whole so much easier than manufacturing any equivalence of good.

And this imbalance is partly an unfortunate fact of life; a fact that new technologies can and will only exacerbate. So here is a terrible problem that the universe has foisted upon us. For destruction is, as a rule, a much more likely outcome than creation. It happens all of the time, as things erode, decay, go wonky and simply give up the ghost. If you drop a vase onto a hard floor, then your vase will reliably shatter into a pile of shards, and yet, if you toss those same shards back into the air they will never reform into a vase again. Or, as Creationists like to point out (entirely missing the bigger point that evolution is not a purely random process), no hurricane could ever blow the parts from a scrapyard together again to reform a Jumbo Jet. Destruction then – i.e., the turning of order into chaos – turns out to be the way our universe prefers to unwind. And it’s tough to fight against this.

The random forces of extreme weather, earthquakes, and fires, are inherently destructive, just because they are erratic and haphazard. So if destruction is our wish, the universe bends rather easily to our will; and this is the diabolical asymmetry underlying the human condition.

In short, it will always be far easier to kill a man than to raise a child to become a man. Killing requires nothing else than the sudden slash of a blade, or the momentary pull on a trigger; the sheer randomness of the bullet’s tumbling wound being more than enough to destroy life. As technology advances, the push of a button increases that same potentiality and enables us to flatten entire cities, nations, civilisations. Today we enjoy the means for mega-destruction, and what was unimaginable in Voltaire’s day becomes another option forever “on the table”, in part, as I say, because destruction is an easy option, comparatively speaking – comparative to creation, that is.

Nevertheless, our modern weapons of mass destruction have all been willfully conceived, and at great expense in terms both of time and resources, when we might instead have chosen to put such time and resources to a wholly profitable use, protecting ourselves from the hazards of nature, or else thoroughly ridding the world of hunger and disease, or by more generally helping to redress the natural though diabolical asymmetry of life. 52

Here then is a partial explanation for the malevolent excesses of human behaviour, although I concede, an ultimately unsatisfactory one. For however easily we are enabled to harm one another, given our soft bodies in a world beset by sharp objects and less visible perils, we do nevertheless have the freedom to choose not to do so. To live and let live and to commit ourselves to the Golden Rule that we “do unto others as we would have others do unto us”. So my principal objection to any wholesale condemnation of our species will have little to do with the estranging and intractable universal laws of nature, however harshly those laws may punish our human condition; instead, it entails a defence founded on anthropocentric considerations.

For if human nature is indeed so fundamentally rotten, then what ought we to make of our indisputable virtues? Of friendship and love; to select a pair of shining examples. And what of the great social reformers and the peacemakers like Gandhi and Martin Luther King? What too of our most beautiful constructions in poetry, art and music? Just what are we to make of this better half to our human nature? And why did human beings formulate the Golden Rule in the first instance?

Of course, even apparent acts of generosity and kindness can, and frequently do, have unspoken selfish motivations, so the most cynical adherents of the ‘dark soul hypothesis’ go further again, reaching the conclusion that all human action is either directly or indirectly self-serving. That friendship, love, poetry and music, along with every act of philanthropy (which literally means “love of man”), are all in one way or another products of the same innate selfishness. According to this surprisingly widespread opinion, even at our finest and most gallant the underlying motivation is always reducible to “you scratch my back…”

Needless to say, all of human behaviour really can, if we choose, be costed in such one-dimensional utilitarian terms. Every action evaluated on the basis of outcomes and measured in terms of personal gain, whether actual or perceived. Indeed, given the mountains of irrefutable evidence that people are all-too-often greedy, shallow, petty-minded and cruel, it is not irrational to believe that humans are invariably and unalterably out for themselves. It follows that kindness is only ever selfishness dressed up in mischievous disguise, and challenging such cynicism is far from easy; it can feel like shouting over a gale. The abrupt answer here is that not all personal gain ought to be judged equivalently. For even if our every whim were, in some ultimate sense, inseparable from, contingent upon, and determined by self-interest, then who is this “self” in which our interests are so heavily vested?

Does the interest of the self include the wants and needs of our family and friends, or even, in special circumstances, the needs of complete strangers, and if so, then do we still call it ‘selfish’? If we love only because it means we receive love in return, or for the love of God (whatever this means), or simply for the pleasure of loving, and if in every case this is deemed selfish, then by definition all acts have become selfish. The meaning of selfishness is thus reduced to nothing more than “done for the self”, which misses the point entirely that selfishness implies a deficiency in the consideration of others. Thus, if we claim that all human action is born of selfishness, as some do, we basically redefine and reduce the meaning of ‘selfish’.

Having said this, I certainly do not wish, however tempting it may be, to paint a false smile where the mouth is secretly snarling. There is nothing to be usefully gained by naivety or sentimentality when it comes to gauging estimates of human nature. Nonetheless, there is an important reason to make a case in defence of our species, even if our defence must be limited to a few special cases. For if there is nothing at all defensible about ‘human nature’, it is hard to see past a paradox, which goes as follows: if human beings are innately and thus irredeemably bad (in accordance with our own estimation, obviously), then how can our societies, with structures that are unavoidably and unalterably human, be in any way superior to the ‘human nature’ that designs them? Must they not be inherently and unalterably bad also? After all, ex nihilo nihil fit – nothing comes from nothing. This is, if you like, the Hobbesian Paradox. (And I shall return to it shortly.)

*

There have been many occasions when writing this book has felt to me a little like feeling around in the dark. Just what is it that I am so urgently trying to say? That feeling has never been more pronounced than when working on this chapter and the one ensuing. For human nature is a subject that leads into ever more divergent avenues and into deeper and finer complexities. What does it even mean to delve into questions about ‘human nature’? The question already presumes that some general innate propensity exists and provides a common explanation for all human behaviour. But immediately, this apparently simple issue brings forth a shifting maze of complications.

Firstly, there is the vital but unresolved debate over free will as opposed to determinism, which at one level is the oldest and most impenetrable of all philosophical problems. All attempts to address this must already presuppose sound concepts of the nature of Nature and of being. However, once we step down to the next level, as we must, we find no certain answers are provided by our physical sciences, which basically posit determinism from the outset in order to proceed.

Then there is the related issue of whether, as biological organisms, humans are predominantly shaped by ‘nature or nurture’. In fact, it has become increasingly clear that the dichotomy is a false one, and the question itself is subtly shifting. What can be said with certainty is that inherited traits are encouraged, amplified, altered and sometimes prohibited by our environment, through processes occurring at both biological and social levels. Beyond this, nature and nurture cannot be so easily disentangled.

The tree grows and develops in accordance not merely with biochemical instructions encoded within its seed but in response to the place where that seed germinates: whether under full sunlight or deep shade, whether its roots penetrate rich or impoverished soil, and in accordance with temporal variations in wind and rainfall. We too are shaped not only by the flukes of genealogy, but by adapting moment by moment to environmental changes from the very instant our father’s sperm penetrated and merged with our mother’s egg. We are no more reducible to Dawkins’ ‘lumbering robots’, those vehicles “blindly programmed to preserve the selfish molecules known as genes” 53 that bloodlessly echo Hobbes, than we are to the ‘tabula rasa’ of Aristotle, Locke, Rousseau and Sartre. Yet somehow this argument lurches on, at least in the public consciousness, always demanding some kind of binary answer as though one remains possible.

As for the question of free will or determinism at a cosmic level, my personal belief is the one already presented in the book’s introduction, although to make matters absolutely unequivocal allow me to proffer my equivalent to Pascal’s famous wager: that one ought to live without hesitation as though free will exists, because if you are right, you gain everything, whereas if you are wrong, you lose nothing. Moreover, the view that we are without agency and altogether incapable of shaping our future involves a shallow pretence that also seeks to deny personal responsibility; it robs us of our dignity and self-respect, and disowns the god that dwells within.

As for proof of this faculty, I have none, and the best supporting evidence is that on occasions when I have most compellingly perceived myself as a thoroughly free agent in the world, there has spontaneously arisen a corresponding anxiety: the sense that possession of such an extravagant gift entails acknowledging the sheer enormity of one’s responsibility. An overwhelming feeling that freedom comes with an excessively heavy price attached.

Indeed, my preferred interpretation of the myth of Eve’s temptation in the Garden of Eden follows from this: that the eating of “the apple” – i.e., the fruit of the tree of the knowledge of good and evil – miraculously and instantly gave birth to free will and conscience as one, with each sustaining the other (like the other snake, Ouroboros, perpetually eating its own tail). It follows that The Fall is nothing besides our human awakening to the contradistinction of good and evil actions, and thus interpreted, this apprehension of morality is simply the contingent upshot of becoming free in a fully conscious sense. 54

Indeed, we might justifiably wonder upon what grounds the most dismal critiques of human nature are founded, if not for the prior existence of a full awareness of moral failings that is itself another component and expression of that same nature. Or, as the French writer La Rochefoucauld put it in one of his most famous and eloquent maxims: “Hypocrisy is the homage which vice renders to virtue.” 55 That is, whenever the hypocrite says one thing then does another, he does so because he recognises his own iniquity but then feigns a moral conscience to hide his shame. Less succinctly, it might be restated that acting with good conscience is hard-wired, and for most people (sociopaths presumably excluded) doing otherwise automatically involves us in compensatory acts of dissemblance, denial and self-delusion.

We have no reason to say humans are wholly exceptional in possessing a conscience, of course, although it seems that we are uncommonly sensitive when it comes to detecting injustice, and the reason is perhaps that (admittedly, this is a hunch) we are uniquely gifted empathisers. Unfortunately, such prodigious talent for getting into the minds of others is one that also makes our species uniquely dangerous.

James Hillman was an American psychologist, who studied at, and then guided studies for, the C.G. Jung Institute in Zurich. In the following interview he speaks about how we have lost our connection to the cosmos and consequently our feelings for the beauty in the world and with it our love for life:

*

The Enlightenment struck many blows, one of which effectively killed God (or at least certain kinds of Theism). In the process, it rather more inadvertently toppled the pedestal upon which humanity had earlier placed itself, as Darwinism slowly but inevitably brought us all back down to earth with a bump. We are no longer the lords of creation; still, the shibboleth of anthropocentrism is much harder to shake.

Hobbes convinced us that ‘human nature’ is dangerous because it is Nature. Rousseau then took the opposing view, arguing that our real problems actually stem from not behaving naturally enough. His famous declaration that “Man is born free, and everywhere he is in chains” forms the opening sentence of his seminal work The Social Contract, the spark that helped to ignite revolutions across Europe. 56 Less than a century later, Marx and Engels concluded The Communist Manifesto by echoing Rousseau with the no less famous imperative, often paraphrased as: “Workers of the world unite! You have nothing to lose but your chains.” 57

In the place of freedom and perhaps out of a desperate sense of loss, we soon recreated ourselves as gods instead and then set about constructing new pedestals based on fascist and Soviet designs. But finally, the truth was out. Humans make terrible gods. And as we tore down the past, remembering in horror the death camps and the gulags, we also invented new stories about ourselves.

In the process, the post-Hobbesian myth of ‘human nature’ took another stride. Rather than being on a level with the rest of creation and mechanically compelled to lust for power and material sustenance like all animals, our species was recast once again as sui generis, though in a different way. Beyond the ability to wield tools, and to manipulate the world through language and indeed through culture more generally, we came to the conclusion that the one truly exceptional feature of humans – the really big thing that differentiates ‘human nature’ from the whole of the rest of nature – was our species’ outstanding tendency to be rapacious and cruel. Thanks to our peculiar desire for self-aggrandisement, this has become the latest way in which we flatter ourselves.

It is sometimes said, for instance, that humans are the only creatures that take amusement in cruelty. At first glance this sounds like a perfectly fair accusation, but just a little consideration finds it to be false. Take the example of the well-fed cat that is stalking a bird: does it not find amusement of a feline kind in its hunt? When it toys with a cornered mouse, meting out a slow death from the multiple blows of its retractable claws, is it not enjoying itself? And what other reason can explain why killer whales will often toss a baby seal from mouth to mouth – shouldn’t they just put it out of its misery?

Ah yes, comes the rejoinder, but still we are the only creatures to engage in full-scale warfare. Well, again, yes and no. The social insects go to war too. Chemical weapons are deployed as one colony defends itself from the raids of an aggressor. When this is granted, here’s the next comeback: ah, but we bring malice aforethought. The social insects are merely acting in response to chemical stimuli. They have pheromones for war, but no savage intent.

This brings us a little closer to home – too close perhaps – since it is well documented that chimpanzees gang up to fight against a rival neighbouring troop. How is this to be differentiated from our own outbreaks of tribal and sectarian violence?

That chimpanzees are capable of malice aforethought has long been known too. Indeed, they have been observed on occasion to bring a weapon to the scene of an attack. But then, you might expect our immediate evolutionary cousins to share a few of our vices! In the 1970s, however, primatologist Jane Goodall was still more dismayed when she saw how the wild chimps she was studying literally descended into a kind of civil war: systematically killing a group of ‘separatists’ one by one and apparently planning their campaign in advance. 57a So yes, of all creatures humans are without any doubt the best able to act with malice aforethought, yet even in this we are apparently not alone.

Okay then… and here is the current fashion in humanity’s self-abasement… we are the only creatures that deliberately destroy their own environment. But again, what does this really mean? When rabbits first landed in Australia (admittedly introduced by humans), did they settle down for a fair share of what was available? When domestic cats first appeared in New Zealand (and sorry to pick on cats again), did they negotiate terms with the flightless birds? And what of the crown of thorns starfish that devours the coral reefs, or of the voracious Humboldt squid swarming in some parts of our oceans and consuming every living thing in sight? Or consider this: when the continents of North and South America first collided and a land bridge allowed the Old World creatures of the North to encounter the New World creatures of the South, the migration of the former caused mass extinction of the latter. The Old World creatures, being better adapted to the new circumstances, simply ate the competition. There was not a man in sight.

In short, Nature’s balance is not maintained thanks to the generosity and co-operation between species: this is a human conceit. Her ways are all too often cruel. Foxes eat rabbits and in consequence their populations grow and shrink reciprocally. Where there is an abundance of prey the predators thrive, but once numbers reach a critical point that feast becomes a famine, which restores the original balance. This is how ‘Nature’s balance’ is usually maintained – just as Malthus correctly describes (more below). But modern humans have escaped this desperate battle for survival, and by means of clever artificial methods we enable our own populations to avoid both predation and famine; an unprecedented situation that really does finally set us apart from all of our fellow species.
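For readers who like to see the mechanism spelled out, the reciprocal boom-and-bust cycle of foxes and rabbits is classically formalised by the Lotka-Volterra predator-prey equations. Here is a minimal sketch using simple Euler integration; all the parameter values are illustrative assumptions, not empirical measurements.

```python
# A minimal discrete-time sketch of the Lotka-Volterra predator-prey model,
# which formalises the fox-and-rabbit cycle described above.
# All parameter values are illustrative, not drawn from field data.

def simulate(prey=40.0, predators=9.0, steps=2000, dt=0.01,
             a=1.0, b=0.1, c=1.5, d=0.075):
    """Euler-integrate dR/dt = aR - bRF and dF/dt = dRF - cF."""
    history = []
    for _ in range(steps):
        history.append((prey, predators))
        prey_change = a * prey - b * prey * predators           # rabbits breed, and get eaten
        predator_change = d * prey * predators - c * predators  # foxes feast, or starve
        prey += prey_change * dt
        predators += predator_change * dt
    return history

hist = simulate()
rabbits = [r for r, f in hist]
foxes = [f for r, f in hist]
# Both populations cycle: abundant prey feeds a predator boom, the boom
# crashes the prey, and the resulting famine then thins the predators.
```

The point of the sketch is simply that neither generosity nor co-operation appears anywhere in the equations, yet a rough balance emerges all the same.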

*

When Donald, son of psychologists Winthrop and Luella Kellogg, turned ten months old, his parents took the extraordinary decision of adopting Gua, a seven-and-a-half-month-old female chimp, to bring up in their home as a surrogate sibling. It was the 1930s and this would be a pioneering experiment in primate behaviour; a comparative study that caused a good deal of dismay in academia and amongst the public. But irrespective of questions of ethics and oblivious to charges of sensationalism, the Kelloggs proceeded, and Donald and Gua ultimately lived together for nine months.

They soon developed a close bond. Although younger, Gua was actually more mature than Donald both intellectually and emotionally. Being protective, she would often hug him to cheer him up. Her development was remarkably swift, and she quickly learned how to eat with a spoon and to drink from a glass. She also learned to walk and to skip – obviously not natural behaviours for a chimp – as well as to comprehend basic words; all of this before Donald had caught up.

This comparative developmental study had to be cut short, however, because by the age of two, Donald’s behaviour was becoming disconcertingly apelike. For one thing, he was regressing to crawling. He had also learned to carry things in his mouth, picking up crumbs with his lips and one day chewing up a shoe, and far more than ordinary toddlers, he took delight in climbing the furniture and trees. Worse still, his language skills were seriously delayed: by eighteen months he knew just three words, so that instead of talking he would frequently grunt or make chimp-like gesticulations instead. The story ends tragically, of course, as all of the ethical concerns were borne out. Gua died of pneumonia less than a year after the study was curtailed and she had been abandoned by the Kellogg family. Donald committed suicide later in life, at the age of 43.

This is a sad story and by retelling it I am in no way endorsing the treatment of Donald and Gua. No such experiment should ever have been conducted, but it was, and the results are absolutely startling nonetheless. Instead of “humanizing the ape”, as the Kelloggs hoped to achieve, the reverse had been occurring. What they had proved inadvertently is that humans are simply more malleable than chimps, or for that matter any other creature on earth. It is humans that learn best by aping and not the other way around.

*

However much we may try to refine our search for answers, it is actually difficult to get beyond the most rudimentary formulation, which ponders whether ‘human nature’ is for the most part good or bad. Rephrased, as it often is, this same inquiry generally receives one of four responses, which can be summarised as follows:

i) that human nature is mostly good but corruptible;

ii) that human nature is mostly bad but can be corrected;

iii) that human nature is mostly bad but with flaws that can be ameliorated – rather than made good; or,

iv) most misanthropically, that human nature is atrocious, and irredeemably so, but that’s life.

The first is the Romanticism of Rousseau, whereas the third and fourth hinge around the cynicism of Hobbes. Whereas Hobbes regarded the ‘state of nature’ as the ultimate threat, Rousseau implores us instead to return to a primitive state of authentic innocence. And it is these extremes of Hobbes and Rousseau that still prevail, informing the nuclear-armed policy of Mutual Assured Destruction on the one hand, and the counterculture of the New Age on the other. Curiously, both peer back distantly to Eden and reassess The Fall from different vantages too. Although deeply unreligious, Hobbes holds the more strictly orthodox Christian view. As undertaker and poet Thomas Lynch laid it out:

[T]he facts of the matter of human nature – we want, we hurt and hunger, we thirst and crave, we weep and laugh, dance and desire more and more and more. We only do these things because we die. We only die because we do these things. The fruit of the tree in the middle of Eden, being forbidden, is sexy and tempting, tasty and fatal.

The fall of Man and Free Market Capitalism, no less the doctrines of Redemptive Suffering and Supply and Demand are based on the notion that enough is never enough… A world of carnal bounty and commercial indifference, where men and women have no private parts, nor shame nor guilt nor fear of death, would never evolve into a place that Darwin and Bill Gates and the Dalai Lama could be proud of. They bit the apple and were banished from it. 58

Forever in the grip of the passions, our ‘appetites’ and ‘aversions’, these conjoined and irrepressible Hobbesian forces of attraction and repulsion continually incite us. In our desperation to escape we flee blindly from our fears, yet remaining hopeful always of entirely satisfying our desires. It’s pain and pleasure all the way: sex and death! And I imagine if you had asked Hobbes whether without the apple “we’d still be blissfully wandering about naked in paradise”, as Dudley Moore put it to Peter Cook’s Devil in the marvelous Faustian spoof Bedazzled, you’d very likely get a similar reply to the one Cook gave him: “they [Adam and Eve] were pig ignorant!” 59

However, the Genesis myth, although a short story, in fact unfolds in two very distinct acts: only the first is concerned with temptation, whereas the denouement is centred on shame. So let’s consider shame for a moment, because shame appears to be unique as an emotion, and though we habitually confuse it with guilt – since both are involved in reactions of conscience – shame has an inescapable social quality. To summarise: guilt involves what you do, while shame is intrinsically bound up with your sense of self. So guilt leads us to make apologies, a healthy response to wrongdoing, whereas you cannot apologise for being bad.


Detail from ‘The Expulsion from the Garden of Eden’ (Italian: Cacciata dei progenitori dall’Eden), a fresco by the Italian Early Renaissance artist Masaccio, ca. 1427. Based on image from Wikimedia Commons.

The American academic Brené Brown describes shame as “the intensely painful feeling or experience of believing that we are flawed and therefore unworthy of love and belonging” 60 and asks us to imagine being in a room with all the people we most love, then walking out and beginning to hear the worst things imaginable said about us; so bad that we doubt we will ever be able to walk back into the room to face everyone again.

In fact, shame is ultimately tied up with fears of being unworthy, unlovable, and of abandonment, which we learn to feel as infants, when isolation and rejection are actual existential threats. So it triggers instinctual responses that humans probably evolved in order to avoid being rejected and ostracised by the group, when this again involved an actual existential threat. Shame is an overwhelming feeling accompanied by many physiological sensations: blushing, the tightening of the chest, feelings of not being able to breathe, and a horrible doubt that sinks to the pit of the stomach. It is really no exaggeration to say that shame feels like death.

Moreover, and unlike our other emotions, shame can be a response to just about anything: our appearance; our own attention-seeking, when we get too boisterous, too over-excited, talking too much (especially about oneself); or when we retreat into isolation, feeling shy and avoidant; or feeling inauthentic, fake; or for being taken advantage of; or conversely being unable to drop our armour, and being judgmental and quick to anger; or just for a lack of ability, skills, or creativity; our failure to communicate properly, including being able to speak up or speak honestly; or when we are lazy, or weak, with low energy or lack of motivation, perhaps sexually; or finally – not that my list is in any way exhaustive – shame can be triggered by anxiety, nervousness and defensiveness, or when we betray our weakness by blushing or showing other visible signs of nervousness or shame. Note the circularity.

Strangely, we can even feel shame without recognising the symptoms, and this may again generate escalating confusion and a terrifying sense of spiralling: a fear that we won’t survive the feeling itself. In fact, shame and fear have a co-existent relationship such that we can alternate between the two, and both may leave terrible psychological scars; some parts of us becoming repressed, others forming a mask – becoming conscious and unconscious aspects (a topic I return to in the next chapter).

Interestingly, Jean-Paul Sartre is often paraphrased as saying “hell is other people”, which is then widely misinterpreted to mean that our relationships with others are invariably poisoned. In fact, what Sartre meant is closer to the idea that hell is the judgment of our own existence in the eyes of other people; so perhaps what he finally intended to say is “hell is our sense of rejection in the eyes of others”. If so, then he was surely right. 61

Seen in this way, the Rousseauian standpoint becomes intriguing. Is it possible that the root cause of all human depravity is finally shame? And if we could get beyond our shame, would this return to innocence throw open the gates to paradise once more?

In this chapter I have already tried to expose the chinks in the rather well-worn armour of Hobbesianism because, for the reasons expounded above, it has been collectively weighing us down. Hobbes’ adamancy that human nature is rotten to the core, with its corollary that little can be done about it, is actually rather difficult to refute; the measure of human cruelty vastly exceeds all real or apparent acts of generosity and kindness. But Hobbes’ account is lacking, and what it lacks above all is empathy. Our capacity for empathy, Brené Brown points out, is obstructed primarily by shame. Why? Because empathy can only flourish where there is vulnerability, and this is precisely what shame crushes.

So yes, we must concede that the little boy who pulls the legs off flies greatly amuses himself. There can be a thrill to malice, if of a rather shallow and sordid kind. But more happiness is frequently to be found in acts of creation than of destruction; more fulfilment in helping than in hindering; and far more comfort in loving than in hating. Even Hobbes, though ‘twinned with fear’, deep down must have known this.

Brené Brown has spent many decades researching shame, which she believes is an unspoken epidemic and the secret behind many forms of disruptive behaviour. An earlier TED talk on vulnerability became a viral hit. Here she explores what can happen when people confront their shame head-on:

*

On the whole, we are not very much into the essence of things these days. Essentialism is out and various forms of relativism are greatly in vogue. That goes for all things except perhaps our ‘human nature’, for which such an essence is very commonly presumed. Yet it seems to me that the closer one peers, the blurrier any picture of our human nature actually becomes; and the harder one tries to grasp its essence, the less tangible it is. In any case, each of the various philosophies that inform our modern ideas of ‘human nature’ is intrinsically tainted by prior, and generally hidden, assumptions, which arise from vestigial religious and/or political dogma.

For instance, if we take our cue from Science (most especially from Natural History and Biology) by seeking answers in the light of Darwin’s discoveries, then we automatically inherit a view of human nature sketched out by Malthus and Hobbes. Malthus who proceeded directly from (his own version of) God at the outset, and Hobbes, who in desperately trying to circumvent the divine, finished up constructing an entire political philosophy based on a notion barely distinguishable from Augustine’s doctrine of Original Sin. Meanwhile almost all of the histories that commonly inform our opinions about human nature are those written about and for the battle-hardened conquerors of empires.

But why suppose that there really is anything deserving the title ‘human nature’ in the first place, especially given what is most assuredly known about our odd species: that we are supremely adaptable and very much more malleable and less instinctive than all our fellow creatures. Indeed the composite words strike me as rather curious, once I can step back a little. After all, ‘human’ and ‘nature’ are not in general very comfortable bedfellows. ‘Human’ meaning ‘artificial’ and ‘nature’ meaning, well… ‘natural’… and bursting with wholesome goodness. Or else, alternatively, ‘human’ translating as humane and civilised, leaving ‘nature’ to supply synonyms for wild, primitive and untamed… and, by virtue of this, red in tooth and claw.

In short, the very term ‘human nature’ is surely an oxymoron, doubly so as we see above. The falsehood of ‘human nature’ concealing the more fascinating if unsettling truth that in so many respects humans conjure up their nature in accordance with how we believe ourselves to be, which rests in turn on what limits are set by our family, our acquaintances and the wider culture. Human nature and human culture are inextricable, giving birth to one another like the paradoxical chicken and egg. As Huxley writes:

‘Existence is prior to essence.’ Unlike most metaphysical propositions, this slogan of the existentialists can actually be verified. ‘Wolf children,’ adopted by animal mothers and brought up in animal surroundings, have the form of human beings, but are not human. The essence of humanity, it is evident, is not something we are born with; it is something we make or grow into. We learn to speak, we accumulate conceptualized knowledge and pseudo-knowledge, we imitate our elders, we build up fixed patterns of thought and feeling and behaviour, and in the process we become human, we turn into persons. 62

Alternatively, we might give a nod to Aristotle who famously declared “man is by nature a political animal”, an assessment seemingly bound up in contradictions while yet abundantly true, and which he then expounds upon saying:

“And why man is a political animal in a greater measure than any bee or any gregarious animal is clear. For nature, as we declare, does nothing without purpose; and man alone of the animals possesses speech. The mere voice, it is true, can indicate pain and pleasure, and therefore is possessed by the other animals as well (for their nature has been developed so far as to have sensations of what is painful and pleasant and to indicate those sensations to one another), but speech is designed to indicate the advantageous and the harmful, and therefore also the right and the wrong; for it is the special property of man in distinction from the other animals that he alone has perception of good and bad and right and wrong and the other moral qualities, and it is partnership in these things that makes a household and a city-state.” 63

To end, therefore, I propose a secular update to Pascal’s wager, which goes as follows: if, and in direct contradiction to Hobbes, we trust in our ‘human nature’ and promote its more virtuous side, then we stand to gain amply in the circumstance that we are right to do so and at little cost, for if it turns out we were mistaken and ‘human nature’ is indeed intrinsically rotten to our bestial cores, our lot as a species is inescapably dreadful whatever we wish to achieve. For in the long run, as new technologies supply ever more creative potential for cruelty and destruction (including self-annihilation), what chance do we have to survive at all if we are so unwilling to place just a little trust in ourselves to do a whole lot better?

Next chapter…

*

Addendum: the Malthusian population bomb scare

Thomas Malthus was a man of many talents. A student of Cambridge University, where he excelled in English, Latin, Greek and Mathematics, he later became a Professor of History and Political Economy and a Fellow of the Royal Society. There is, however, one subject above all others with which Malthus remains closely associated, and that is demography – the study of human populations – a rather single-minded preoccupation that during his tenure as professor is supposed to have earned him the nickname “Pop” Malthus.

Malthus’s big idea was precisely this: that whereas human population increases geometrically, food production, upon which the growing population inevitably depends, can only increase in an arithmetic fashion. He outlines his position as follows:

I think I may fairly make two postulata. First, That food is necessary to the existence of man. Secondly, That the passion between the sexes is necessary and will remain nearly in its present state. These two laws, ever since we have had any knowledge of mankind, appear to have been fixed laws of our nature, and, as we have not hitherto seen any alteration in them, we have no right to conclude that they will ever cease to be what they now are… 64

Given that populations always grow exponentially whereas food production must inevitably be arithmetically limited, Malthus concludes that the depressing, but unassailable consequence is a final limit not simply to human population but to human progress and “the perfectibility of the mass of mankind”:

This natural inequality of the two powers of population and of production in the earth, and that great law of our nature which must constantly keep their effects equal, form the great difficulty that to me appears insurmountable in the way to the perfectibility of society. All other arguments are of slight and subordinate consideration in comparison of this. I see no way by which man can escape from the weight of this law which pervades all animated nature. No fancied equality, no agrarian regulations in their utmost extent, could remove the pressure of it even for a single century. And it appears, therefore, to be decisive against the possible existence of a society, all the members of which should live in ease, happiness, and comparative leisure; and feel no anxiety about providing the means of subsistence for themselves and families. 65
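Malthus’s two “powers” are easy to sketch numerically. The rates in the following few lines of Python are illustrative placeholders rather than Malthus’s own figures, but they show why any geometric (exponential) series must eventually overtake any arithmetic (linear) one, however generous the starting surplus:

```python
def first_shortfall(pop0=1.0, food0=2.0, pop_ratio=2.0, food_step=1.0, max_gen=100):
    """Return the first generation at which population exceeds food supply.

    Population grows geometrically (multiplied by pop_ratio each generation);
    food grows arithmetically (increased by food_step each generation).
    """
    pop, food = pop0, food0
    for gen in range(1, max_gen + 1):
        pop *= pop_ratio   # geometric: 1, 2, 4, 8, ...
        food += food_step  # arithmetic: 2, 3, 4, 5, ...
        if pop > food:
            return gen
    return None

print(first_shortfall())  # with these illustrative rates the crossover comes at generation 3
```

Note that the conclusion is insensitive to the particular numbers chosen: so long as the population ratio is greater than one, the crossover merely arrives sooner or later. The real question, taken up below, is whether Malthus’s two premises describe the world at all.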

It’s a truly grim message, although in fairness to Malthus, the gloom is delivered in a lively and frequently entertaining style. That said, however, Malthus was wrong. Terribly wrong.

Firstly, he was wrong in terms of specifics, since he wildly over-estimated the rate of population growth 66, thereby exaggerating the number of future mouths needing to be fed and, by extension, the amount of food needed to fill them. Obviously what Malthus was lacking here was actual available statistics, and it is perhaps not surprising, therefore, that he later became one of the founder members of the Statistical Society in London 67: the first organisation in Britain dedicated to the collection and collation of national statistics. Charles Babbage, who is nowadays best remembered as the inventor of early calculating machines, known as “difference engines” – machines that helped to lead the way to modern computing – was another founder member of the group, and obviously took statistics very seriously indeed. He even once corrected the poet Alfred Tennyson in a letter as follows:

In your otherwise beautiful poem, one verse reads, ‘Every moment dies a man,/ Every moment one is born’: I need hardly point out to you that this calculation would tend to keep the sum total of the world’s population in a state of perpetual equipoise whereas it is a well-known fact that the said sum total is constantly on the increase. I would therefore take the liberty of suggesting that in the next edition of your excellent poem the erroneous calculation to which I refer should be corrected as follows: ‘Every moment dies a man / And one and a sixteenth is born.’ I may add that the exact figures are 1.167, but something must, of course, be conceded to the laws of metre. 68

It may be noted then, that such a rate of increase (presumably based on real statistics), although still exponential, is far below the presumed rates of growth in Malthus’s essay. But then Malthus’s estimate may be fairly excused; his famous essay having been first published about four decades before any statistics would have been available. Malthus was, however, also more fundamentally wrong in his thesis; for such catastrophic oscillations as he envisaged through cycles of overpopulation and famine are not the order of our times, and less so now than even during his own times of relatively small populations. In fact contrary to Malthus’ prophecies of doom, we have a great plenty of food to go around (lacking merely the political and economic will to distribute it fairly) 69, with official UN estimates indicating that we shall continue to have such abundance for the foreseeable future. 70

*

I can still recall when, as a sixth-former, I’d first heard about Malthus’ theory of population, and how it had sounded like altogether the daftest, most simplistic theory I’d ever come across – an opinion that remained for at least a few months before I’d heard about Abraham Maslow’s “hierarchy of needs” which I then considered still dafter and more simplistic again. In both cases, it was clear to me that supposition and conjecture were being presented as quasi-scientific fact. In Maslow’s case, with his hierarchical stacking of physical and psychological needs, it was also self-evident that no such ascending pyramid really existed anywhere outside of Maslow’s own imaginings. That you might just as well construct a dodecahedron of pleasures, or a chocolate cheesecake of motivational aspirations, as make up any kind of pyramid of human needs.

I was judging his ideas unfairly, however, and in hindsight I see that I was prejudiced by my scientific training. As a student of Physics, Chemistry and Mathematics, I’d become accustomed to rigorously grounded theories in which predictions can and must be made and tested against actual data. But Maslow’s theory is not a theory of this kind. It is inherently nonrigorous, and yet it may still be valuable in another way. As a psychologist he had diverged from the contemporary practice of expanding the field purely on the basis of neuroses and complexes, and he sought instead a more humanistic approach to analysing what he thought constituted healthy-mindedness. His main concern was how people might achieve “self actualization”. So his ‘theory’ is better understood and judged within this context, and the same goes for other nonrigorous formulations. 71

With Malthus, however, my irritation was coloured differently. His theory may have been simply an educated and carefully considered hunch, but it did at least present us with outcomes that could be scientifically reviewed. Plainly, however, all the available facts confounded his case absolutely.

After all, it had been two centuries since Malthus first conjectured on the imminence of food shortages, yet here we were, hurtling towards the end of the twentieth century, still putting too many leftovers in our bins. And though people living in the third world (as it was then called) were desperately poor and undernourished – as remains the case – this was already the consequence of our adopted modes of distribution rather than any consequence of insufficient production of food as such. Indeed, as a member of the EEC, the United Kingdom was responsible for its part in the storage of vast quantities of food and drink that would never be consumed: the enormous ‘mountains of cheese’ and the ‘lakes of milk and wine’ being such prominent features of the politico-economic landscape of my adolescence.

So where precisely did Malthus go wrong? In fact, both of his purportedly axiomatic postulates are unfounded. Regarding food production being an arithmetic progression, he completely failed to factor in the staggering ingenuity of human beings. He seems curiously oblivious to how, even at the turn of the nineteenth century when his essay was written, food production was already undergoing some dramatic technological shifts, including methods of selective breeding, and with the advent of mechanised farming equipment. The more recent developments of artificial fertilisers and pesticides have enabled cultivation of far greater acreage, with crop yields boosted far in excess of any arithmetic restriction. With the latest “green technologies” permitting genetic manipulation, the amounts of food we are able to produce might be vastly increased again, if this is what we should choose to do – and I do not say that we should automatically resort to such radical and potentially hazardous new technologies, only that there are potential options to forestall our supposed Malthusian fate.

Meanwhile, on the other side of Malthus’s inequality, we see that his estimates of rates of population growth were wrong for different but perhaps related reasons. Again, he underestimates our adaptive capability as a species, but here the error is born out of an underlying presumption; one that brings me right back to the question of ‘human nature’.

*

Perhaps the most interesting and intriguing part of Malthus’ famous essay is not the accounts of his discredited formulas that illustrate the mismatch between population growth and food production, but the concluding pages. These are chapters not about geometric and arithmetic progressions, nor of selected histories to convince us of the reality of our predicament, nor even of the various criticisms of progressive thinkers who he is at pains to challenge – no, by far the most interesting part (in my humble opinion) is the final chapters, where he enters into discussion of his real specialism, which was theology. For Reverend Malthus was first and foremost a man of the cloth, and it turns out that his supposed axiomatic propositions have actually arisen from his thoughts about the nature of God, of Man, of the Mind, and of Matter and Spirit. 72, 73

In short, Malthus argues here that God fills us with needs and wants in order to stimulate action and develop our minds; necessity being such a constant and reliable mother of invention. And Malthus draws support from the enlightenment philosophy of empiricist and humanist John Locke:

“If Locke’s idea be just, and there is great reason to think that it is, evil seems to be necessary to create exertion, and exertion seems evidently necessary to create mind.” This given, it must follow, Malthus says, that the hardships of labour required for survival are “necessary to the enjoyment and blessings of life, in order to rouse man into action, and form his mind to reason.” 74

Whilst adding further that:

The sorrows and distresses of life form another class of excitements, which seem to be necessary, by a peculiar train of impressions, to soften and humanize the heart, to awaken social sympathy, to generate all the Christian virtues, and to afford scope for the ample exertion of benevolence.

The perennial theological “problem of evil” is thus surmountable, Malthus says, if one accepts “the infinite variety of forms and operations of nature”, since “evil exists in the world not to create despair, but activity.” In other words, these things are sent to try us, or rather, because Malthus is very keen to distance himself from more traditional Christian notions of reward and punishment, “not for the trial, but for the creation and formation of mind”. Without pain and distress there would be no pricks to kick against, and thus no cause to perfect ourselves. This, at least, is Malthus’ contention.

In this he echoes a theodicy already well developed by one of the true Enlightenment geniuses, Gottfried Wilhelm Leibniz. Best remembered now as the independent discoverer of calculus, unaware of Newton’s parallel development, Leibniz also left us an astonishing intellectual legacy with published articles on almost every subject including politics, law, history and philosophy. In a collection of essays from 1710, and in making his own case for the goodness of God, it was Leibniz who first described our world as “the best of all possible worlds”. 75

Famously, Voltaire stole Leibniz’s aphorism and, by reworking it into the central motif of his marvellous satire Candide (written 1759), invested it with characteristically biting irony. In Candide’s adventures, Voltaire turns the phrase into the favourite maxim and motto of his learned companion and teacher Dr Pangloss. The Panglossian faith is an unimpeachable acceptance of divine and cosmic beneficence, maintained in spite of every horror and irrespective of all the disasters the pair witness and that befall them. Shipwrecks, summary executions, and even torture at the hands of the Inquisition; all is justifiable in this best of all possible worlds. For Malthus, although writing some four decades after Voltaire’s no-nonsense lampooning, an underpinning belief in a world that was indeed “the best of all possible worlds” remained central to his thesis; Malthus even declaring with Panglossian optimism that:

… we have every reason to think that there is no more evil in the world than what is absolutely necessary as one of the ingredients in the mighty process [of Life]. 76

So what does all of this mean for Malthus’s God? Well, God is mysterious and ultimately unfathomable, because “infinite power is so vast and incomprehensible an idea that the mind of man must necessarily be bewildered in the contemplation of it.” This accepted, Malthus then argues that we do have clues, however, for understanding God through objective analysis of his handiwork, by “reason[ing] from nature up to nature’s God and not presum[ing] to reason from God to nature.”

Yes, says Malthus, we might fancy up “myriads and myriads of existences, all free from pain and imperfection, all eminent in goodness and wisdom, all capable of the highest enjoyments, and unnumbered as the points throughout infinite space”, but these are “crude and puerile conceptions” born of the inevitable and unassailable ignorance and bewilderment we have before God. Far better then, to:

“… turn our eyes to the book of nature, where alone we can read God as he is, [to] see a constant succession of sentient beings, rising apparently from so many specks of matter, going through a long and sometimes painful process in this world, but many of them attaining, ere the termination of it, such high qualities and powers as seem to indicate their fitness for some superior state. Ought we not then to correct our crude and puerile ideas of infinite Power from the contemplation of what we actually see existing? Can we judge of the Creator but from his creation?”

So God, at least according to Rev. Malthus, is to be understood directly through Nature – an idea that is bordering on the heretical. But what of the Principle of Population? How does this actually follow from the Malthusian “God of nature”? 77

Here we must remind ourselves again that what nowadays are sometimes called our instinctual drives, and what Malthus describes as “those stimulants to exertion which arise from the wants of the body”, are to Malthus but necessary evils. They are evils but with a divine purpose, and this purpose alone justifies their existence. In particular, those wants of the body which Malthus coyly refers to as “the passion between the sexes” are, in this scheme, the necessary means for the human race to perpetuate itself. With sex directly equated to procreation.

On the face of it then, Malthus must have been entirely ignorant of the sorts of sexual practices that can never issue progeny. (To rework a line from Henry Ford) sex might be any flavour you like, so long as it is vanilla! More likely, however, he dismissed any such ‘contraceptive’ options not because of ignorance but on the grounds of his deep-seated Christian morality. Rum and the lash, in moderation possibly, but sodomy… we are British!

If Malthus could be brought forward to see the western world today, what he’d find would doubtless be a tremendous shock in many ways. Most surprisingly, however, he would discover a culture where ‘the passions’ are endlessly titillated and aroused, and where “the wants of the body” are very easily gratified. Quite aside from the full-frontal culture shock, Malthus would surely be even more astonished to hear that our libidinous western societies have solved his supposedly insoluble population problem; our demographics flattening off, and our numbers in a slow but annual decline.

Malthus had argued very strongly against the poor laws, calling for their eventual abolition. He firmly believed that all kinds of direct intervention only encouraged a lack of moral restraint, which was the underlying root of all the problems. He earnestly believed that it would be better to let nature take care of these kinds of social diseases. Yet we can now see that one solution to his population problem has been the very thing he was fighting against. That the populations in our modern societies have stabilised precisely because of our universal social welfare and pension systems: safety nets that freed us all from total reliance upon the support of our children in old age.

We also see that as child mortality has markedly decreased, parents have little reason to raise such large families in the first instance. And that once more people – women especially – won access to a basic education, the personal freedom this affords gave them further opportunity and better reason to plan ahead and settle for smaller families. It is thanks to all of these social changes, combined with the development of the contraceptive pill, that “the passion between the sexes” has been more or less surgically detached from population growth.

Making life tougher, Malthus reasoned, would be the bluntest tool for keeping down the numbers, especially of the lower classes. Yet if he landed on Earth today, he would discover irrefutable proof that the exact opposite is the case. That where nations are poorest, populations are rising fastest. There is much that Malthus presumed to be common sense but that, in fact, turns out to be false. 78

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text all newly incorporated text has been italicised.

*

1 From Prince Hamlet’s monologue to Rosencrantz and Guildenstern in Hamlet Act II, Scene 2. In fuller context:

What a piece of work is a man! How noble in reason, how infinite in faculty! In form and moving how express and admirable! In action how like an angel, in apprehension how like a god! The beauty of the world. The paragon of animals. And yet, to me, what is this quintessence of dust? Man delights not me. No, nor woman neither, though by your smiling you seem to say so.

2  Quote taken from the Introduction to The Naked Ape written by Desmond Morris, published in 1967; Republished in: “The Naked Ape by Desmond Morris,” LIFE, Vol. 63, Nr. 25 (22 Dec. 1967), p. 95.

3 Stanley Kubrick speaking in an interview with Eric Norden for Playboy (September 1968)

4 “It takes all the running you can do, to keep in the same place.”

5 The original script for 2001 also had an accompanying narration, which reads:

“By the year 2001, overpopulation has replaced the problem of starvation but this is ominously offset by the absolute and utter perfection of the weapon.”

“Hundreds of giant bombs had been placed in perpetual orbit above the Earth. They were capable of incinerating the entire earth’s surface from an altitude of 100 miles.”

“Matters were further complicated by the presence of twenty-seven nations in the nuclear club.”

6 From the Stanley Kubrick interview with Playboy magazine (1968). http://dpk.io/kubrick

7 From the chapter on “Generation” from Zoonomia; or the Laws of Organic Life (1794) written by Erasmus Darwin http://www.gutenberg.org/files/15707/15707-h/15707-h.htm#sect_XXXIX

8

In October 1838, that is, fifteen months after I had begun my systematic inquiry, I happened to read for amusement Malthus On Population, and being well prepared to appreciate the struggle for existence which everywhere goes on from long-continued observation of the habits of animals and plants, it at once struck me that under these circumstances favourable variations would tend to be preserved, and unfavourable ones to be destroyed. The results of this would be the formation of a new species. Here, then I had at last got a theory by which to work; but I was so anxious to avoid prejudice, that I determined not for some time to write even the briefest sketch of it.

From Charles Darwin’s autobiography (1876), pp. 34–35

9 Bellum omnium contra omnes, a Latin phrase meaning “the war of all against all”, is the description that Thomas Hobbes gives to human existence in “the state of nature”, which he describes first in De Cive (1642) and later in Leviathan (1651). The Latin phrase occurs in De Cive:

“… ostendo primo conditionem hominum extra societatem civilem, quam conditionem appellare liceat statum naturæ, aliam non esse quam bellum omnium contra omnes; atque in eo bello jus esse omnibus in omnia.”

“I demonstrate, in the first place, that the state of men without civil society (which state we may properly call the state of nature) is nothing else but a mere war of all against all; and in that war all men have equal right unto all things.”

In chapter XIII of Leviathan, Hobbes more famously expresses the same concept in these words:

Hereby it is manifest that during the time men live without a common Power to keep them all in awe, they are in that condition which is called War; and such a war as is of every man against every man.[…] In such condition there is no place for Industry, because the fruit thereof is uncertain: and consequently no Culture of the Earth; no Navigation, nor use of the commodities that may be imported by Sea; no commodious Building; no Instruments of moving and removing such things as require much force; no Knowledge of the face of the Earth; no account of Time; no Arts; no Letters; no Society; and which is worst of all, continual Fear, and danger of violent death; And the life of man solitary, poor, nasty, brutish, and short.

10 The glee with which my old professor had jokingly dismissed Galileo was undisguised, and he was quick to add that he regarded Galileo’s reputation as greatly inflated. What other physicist, he inquired of us, is remembered only by their first name? With hindsight, I can’t help wondering to what he was alluding. It is mostly kings and saints (and the convergent category of popes) who we find on first-name historical terms. The implication seems to be that Galileo has been canonised as our first secular saint (after Leonardo presumably). Interestingly, and in support of this contention, Galileo’s thumb and middle fingers plus a tooth and a vertebra (removed from his corpse by admirers during the 18th century) have recently been put on display as relics in the Galileo Museum in Florence.

11 Alexander Pope (1688–1744): ‘Epitaph: Intended for Sir Isaac Newton’ (1730)

12 The famous quote comes from a letter Newton sent to fellow scientist Robert Hooke, in which about two-thirds of the way down on the first page he says “if I have seen further, it is by standing on the shoulders of giants.” It has been suggested that this remark was actually intended as a snide dig at Hooke, a rival with whom Newton was continually in dispute and who was known for being rather short in physical stature.

13 From Il Saggiatore (1623) by Galileo Galilei. In the original Italian the same passage reads:

La filosofia è scritta in questo grandissimo libro, che continuamente ci sta aperto innanzi agli occhi (io dico l’Universo), ma non si può intendere, se prima non s’impara a intender la lingua, e conoscer i caratteri ne’ quali è scritto. Egli è scritto in lingua matematica, e i caratteri son triangoli, cerchi ed altre figure geometriche, senza i quali mezzi è impossibile intenderne umanamente parola; senza questi è un aggirarsi vanamente per un oscuro labirinto

14

Hobbes and the earl of Devonshire journeyed to Italy late in 1635, remaining in Italy until the spring of 1636 when they made their way back to Paris. During this tour of Italy Hobbes met Galileo, although the dates and details of the meeting are not altogether clear. In a letter to Fulgenzio Micanzio from 1 December, 1635, Galileo reports that “I have had many visits by persons from beyond the alps in the last few days, among them an English Lord who tells me that my unfortunate Dialogue is to be translated into that language, something that can only be considered to my advantage.” The “English Lord” is almost certainly Devonshire, and the projected English translation of the Dialogue is presumably the work of Dr. Joseph Webb mentioned in Hobbes’s February, 1634 letter to Newcastle. It is therefore likely that Hobbes met Galileo in December of 1635, although Hobbes was not otherwise known to be in Florence until April of 1636. Aubrey reports that while in Florence Hobbes “contracted a friendship with the famous Galileo Galileo, whom he extremely venerated and magnified; and not only as he was a prodigious witt, but for his sweetness of nature and manners”. Legend even has it that a conversation with Galileo in 1635 or 36 inspired Hobbes to pursue the goal of presenting moral and political philosophy in a rigorously geometrical method, although the evidence here is hardly compelling.

From a paper entitled Galileo, Hobbes, and the Book of Nature by Douglas M. Jesseph, published in Perspectives on Science (2004), vol. 12, no. 2 by The Massachusetts Institute of Technology. It is footnoted with the following qualifier:

The evidence, such as it is, comes from the eighteenth century historian of mathematics Abraham Kästner, who reported “John Albert de Soria, former teacher at the university in Pisa, assures us it is known through oral tradition that when they walked together at the grand-ducal summer palace Poggio Imperiale, Galileo gave Hobbes the first idea of bringing moral philosophy to mathematical certainty by treating it according to the geometrical method”. Schumann dismisses the tale as “certainly false,” basing this judgment on a variety of evidence, including the fact that Soria himself expressed skepticism about the story.

https://watermark.silverchair.com/106361404323119871.pdf?token=AQECAHi208BE49Ooan9kkhW_Ercy7Dm3ZL_9Cf3qfKAc485ysgAAAo4wggKKBgkqhkiG9w0BBwagggJ7MIICdwIBADCCAnAGCSqGSIb3DQEHATAeBglghkgBZQMEAS4wEQQMsyC-rL3cNaA4jxGKAgEQgIICQUv8KppqEobaooIWAp4zOmspRnjPLemQuJPq9SdYMkXz9MdidZukWj-XPLej4xmVxFg9w13iQjQ6vJBkevCQSAHpI7Ltsdr5Cq9OtusB7kZ72Z2ERWX8aW6-6nXgo5VX3pcUKwR8rfd6uRrDRlT-av27Qg3Gr2yE5bitEnOuljPtwnYeI9ZAAwbu6d9Ncg7_W5_StRVBELTJ7QTjzjsM9Dx0B64IGa9o0L7hTPdc6PkOUK23g6D4dCZ2kN2Qn3fEh-Uwkkm_iYO2DrOqUQM_dkkcjpRGJDrSvUrMpOSpVBPh7V2vz8TzaE_8D3300Zm_f8pkiNKBrqPJ1ghe3b7VmfPj9-foW_4rZCNN2SkcosyNg1988UWm155UoesLrh8NZUm3sxgVnkPafBIx7xmHGdcVmxpQHCH-8Ahju5_VvOx-LfSCbkdc1zFG0Qs-jH4ecrL9ESPQGDhRCUwjtnsCuuC8gjM6UFXl9Fd8bzrdTvVukzlOYEleSlWc-mStmEsiGZ85dPSCKMrv3-jYiXk6k5JvtoFQvYquvcN_krLTYLw0tjzlO-b_0zvRzWWVQnrnjNDkkLWFCAKkDqAIK8OhLfafzHfXenkgvjhcV4Ba1XWp0a5Ji8THKrPO1S3Sa65xm_jgTmlPVVJ69Ar2GWAFBveO6DLy79G6KRKFtE-K9908bmblJzHAUqkI1btDuOIcXCbZy2tFnDj1Dk3lcSuUtJrVeUCsGCFynA8AiN16CTvKUZx3XJvdzv6XGyfE-5n_BE0

15

There be in Animals, two sorts of Motions peculiar to them: One called Vital; begun in generation, and continued without interruption through their whole life; such as are the Course of the Blood, the Pulse, the Breathing, the Concoctions, Nutrition, Excretion, &c; to which Motions there needs no help of Imagination: The other in Animal Motion, otherwise called Voluntary Motion; as to Go, to Speak, to Move any of our limbs, in such manner as is first fancied in our minds. That Sense is Motion in the organs and interior parts of man’s body, caused by the action of the things we See, Hear, &c

Quote from, Leviathan (1651), The First Part, Chapter 6, by Thomas Hobbes (with italics and punctuation as in the original but modern spelling). https://www.gutenberg.org/files/3207/3207-h/3207-h.htm#link2H_PART1

16

[A]lthough unstudied men, do not conceive any motion at all to be there, where the thing moved is invisible; or the space it is moved in, is (for the shortness of it) insensible; yet that doth not hinder, but that such Motions are. For let a space be never so little, that which is moved over a greater space, whereof that little one is part, must first be moved over that. These small beginnings of Motion, within the body of Man, before they appear in walking, speaking, striking, and other visible actions, are commonly called ENDEAVOUR.

Ibid.

17

This Endeavour, when it is toward something which causes it, is called APPETITE, or DESIRE; the later, being the general name; and the other, oftentimes restrained to signify the Desire of Food, namely Hunger and Thirst. And when the Endeavour is fromward [i.e., distant from] something, it is generally called AVERSION. These words Appetite, and Aversion we have from the Latin; and they both of them signify the motions, one of approaching, the other of retiring. […]

Of Appetites, and Aversions, some are born with men; as Appetite of food, Appetite of excretion, and exoneration, (which may also and more properly be called Aversions, from somewhat they feel in their Bodies;) and some other Appetites, not many. The rest, which are Appetites of particular things, proceed from Experience, and trial of their effects upon themselves, or other men. For of things we know not at all, or believe not to be, we can have no further Desire, than to taste and try. But Aversion we have for things, not only which we know have hurt us; but also that we do not know whether they will hurt us, or not.

Ibid.

18 Quote from, Leviathan (1651), The First Part, Chapter 8, by Thomas Hobbes (with italics and punctuation as in the original but modern spelling).

19 Ibid.

20 Ibid.

21 S. L. A. Marshall’s findings were compiled in a seminal work titled Men Against Fire (1947).

22

In the aftermath of the Battle of Gettysburg, the Confederate Army was in full retreat, forced to abandon all of its dead and most of its wounded. The Union Army and citizens of Gettysburg had an ugly cleanup task ahead of them. Along with the numerous corpses littered about the battlefield, at least 27,574 rifles (I’ve also seen 37,574 listed) were recovered. Of the recovered weapons, a staggering 24,000 were found to be loaded, either 87% or 63%, depending on which number you accept for the total number of rifles. Of the loaded rifles, 12,000 were loaded more than once and half of these (6,000 total) had been loaded between three and ten times. One poor guy had reloaded his weapon twenty-three times without firing a single shot.

From On Killing: The Psychological Cost of Learning to Kill in War and Society (1996) by Dave Grossman

23 The same passage concludes:

Another doctrine repugnant to Civil Society, is, that “Whatsoever a man does against his Conscience, is Sin;” and it dependeth on the presumption of making himself judge of Good and Evil. For a man’s Conscience, and his Judgement is the same thing; and as the Judgement, so also the Conscience may be erroneous. Therefore, though he that is subject to no Civil Law, sinneth in all he does against his Conscience, because he has no other rule to follow but his own reason; yet it is not so with him that lives in a Common-wealth; because the Law is the public Conscience, by which he hath already undertaken to be guided.

Quote from, Leviathan (1651), The Second Part, Chapter 29, by Thomas Hobbes (with italics and punctuation as in the original but modern spelling).

24 Hobbes had actually tried to found his entire philosophy on mathematics but in characteristically contrarian fashion was also determined to prove that mathematics itself was also reducible to materialistic principles. This meant rejecting an entire tradition that began with Euclid and that continues today, and which recognises that the foundations of geometry lie in abstractions such as points, lines and surfaces. In response to Hobbes, John Wallis, Oxford University’s Savilian Professor of Geometry and founding member of the Royal Society, had publicly engaged with the “pseudo-geometer” in a dispute that raged from 1655 until Hobbes’s death in 1679. To illustrate the problem with Hobbes’s various “proofs” of unsolved problems including squaring the circle (all of which were demonstrably incorrect), Wallis had asked rhetorically: “Who ever, before you, defined a point to be a body? Who ever seriously asserted that points have any magnitude?”

You can read more about this debate in a paper published by The Royal Society titled Geometry, religion and politics: context and consequences of the Hobbes–Wallis dispute written by Douglas Jesseph, published October 10, 2018. https://doi.org/10.1098/rsnr.2018.0026

25 Quote from, Leviathan (1651), The First Part, Chapter 5, by Thomas Hobbes (with italics and punctuation as in the original but modern spelling).

26 From The Perils of Obedience  (1974) by Stanley Milgram, published in Harper’s Magazine. Archived from the original on December 16, 2010. Abridged and adapted from Obedience to Authority.

27 Ibid.

28 From The Life of the Robin, Fourth Edition (1965), Chapter 15 “A Digression on Instinct” written by David Lack.

29 From Historia Vitae et Mortis by Sir Francis Bacon (‘History of Life and Death’, 1623).

30 Morphological changes such as albinism and loss of sight are common to all cave-dwelling species including invertebrates, fish and also birds. It is presumed that these changes have come about because they save energy and thus confer an evolutionary advantage, although biologists find it difficult to explain the loss of pigmentation, since very little energy seems to be saved in this way.

31 From a Tanner Lecture on Human Values entitled Morality and the Social Instincts: Continuity with the Other Primates delivered by Frans B. M. de Waal at Princeton University on November 19–20, 2003.

The abstract begins:

The Homo homini lupus [“Man is wolf to man.”] view of our species is recognizable in an influential school of biology, founded by Thomas Henry Huxley, which holds that we are born nasty and selfish. According to this school, it is only with the greatest effort that we can hope to become moral. This view of human nature is discussed here as “Veneer Theory,” meaning that it sees morality as a thin layer barely disguising less noble tendencies. Veneer Theory is contrasted with the idea of Charles Darwin that morality is a natural outgrowth of the social instincts, hence continuous with the sociality of other animals. Veneer Theory is criticized at two levels. First, it suffers from major unanswered theoretical questions. If true, we would need to explain why humans, and humans alone, have broken with their own biology, how such a feat is at all possible, and what motivates humans all over the world to do so. The Darwinian view, in contrast, has seen a steady stream of theoretical advances since the 1960s, developed out of the theories of kin selection and reciprocal altruism, but now reaching into fairness principles, reputation building, and punishment strategies. Second, Veneer Theory remains unsupported by empirical evidence.

https://tannerlectures.utah.edu/_documents/a-to-z/d/deWaal_2005.pdf

32 Quote from a NOVA interview, “The Bonobo in All of Us”, PBS, January 1, 2007.

33 Quote from a NOVA interview, “The Bonobo in All of Us”, PBS, January 1, 2007.

35 The second stanza of Wallace Stevens’s poem Thirteen Ways of Looking at a Blackbird.

36 As he explained in an interview published in the Royal Society of Biology journal The Biologist Vol 60(1) p16-20. https://www.rsb.org.uk/biologist-interviews/richard-dawkins

37 Extracts taken from Chapter 2, pp 45-48, “Seeing Voices” by Oliver Sacks, first published 1989, Picador.

38 Aldous Huxley in the Foreword of ‘The First and Last Freedom’ by Jiddu Krishnamurti.

In his collection of essays Adonis and the Alphabet (1956), the first chapter titled “The Education of an Amphibian” begins as follows:

Every human being is an amphibian— or, to be more accurate, every human being is five or six amphibians rolled into one. Simultaneously or alternately, we inhabit many different and even incommensurable universes. To begin with, man is an embodied spirit. As such, he finds himself infesting this particular planet, while being free at the same time to explore the whole spaceless, timeless world of universal Mind. This is bad enough; but it is only the beginning of our troubles. For, besides being an embodied spirit, each of us is also a highly self-conscious and self-centred member of a sociable species. We live in and for ourselves; but at the same time we live in and, somewhat reluctantly, for the social group surrounding us. Again, we are both the products of evolution and a race of self-made men. In other words, we are simultaneously the subjects of Nature and the citizens of a strictly human republic, which may be anything from what St Paul called ‘no mean city’ to the most squalid of material and moral slums.

39 Also from the first chapter titled “The Education of an Amphibian” of Aldous Huxley’s collection of essays Adonis and the Alphabet (1956).

39a Quote taken from “Rixty Minutes”, Episode 8, Season 1, of adult cartoon Rick and Morty first broadcast by the Cartoon Network on March 17, 2014.

40 The quote is directly addressed to political philosopher and anarchist Pierre-Joseph Proudhon in Chapter 2: “The Metaphysics of Political Economy”; Part 3: “Competition and Monopoly” of Karl Marx’s The Poverty of Philosophy, a critique of the economic and philosophical doctrine of Proudhon, first published in 1847. In full the quote reads:

“M. Proudhon does not know that all history is nothing but a continuous transformation of human nature.”

https://www.marxists.org/archive/marx/works/1847/poverty-philosophy/

41 Quote taken from Episode 3 of Romer’s Egypt first broadcast on BBC TV in 1982.

42 From Christopher Columbus’s log for Friday, Saturday and Sunday October 12 –14, 1492. https://www.americanjourneys.org/pdf/AJ-062.pdf

43 The following are separate entries:

“With my own eyes I saw Spaniards cut off the nose and ears of Indians, male and female, without provocation, merely because it pleased them to do it. …Likewise, I saw how they summoned the caciques and the chief rulers to come, assuring them safety, and when they peacefully came, they were taken captive and burned.”

“They laid bets as to who, with one stroke of the sword, could split a man in two or could cut off his head or spill out his entrails with a single stroke of the pike.”

“They took infants from their mothers’ breasts, snatching them by the legs and pitching them headfirst against the crags or snatched them by the arms and threw them into the rivers, roaring with laughter and saying as the babies fell into the water, ‘Boil there, you offspring of the devil!’”

“They attacked the towns and spared neither the children nor the aged nor pregnant women nor women in childbed, not only stabbing them and dismembering them but cutting them to pieces as if dealing with sheep in the slaughter house.”

“They made some low wide gallows on which the hanged victim’s feet almost touched the ground, stringing up their victims in lots of thirteen, in memory of Our Redeemer and His twelve Apostles, then set burning wood at their feet and thus burned them alive.”

From the History of the Indies (1561) by Bartolome de las Casas.

44 Ibid.

45 As with many of the best-known quotes, the first appears to be misattributed and the second is very possibly a reworking of an utterance by Voltaire. While it is true that Napoleon is reported as once saying in conversation: “What then is, generally speaking, the truth of history? A fable agreed upon,” the phrase certainly predates him. The first quote, “History is written by the winners”, can however be traced to the pen of George Orwell, from one of a series of articles published by the Tribune under the title “As I Please”, in which he wrote:

During part of 1941 and 1942, when the Luftwaffe was busy in Russia, the German radio regaled its home audience with stories of devastating air raids on London. Now, we are aware that those raids did not happen. But what use would our knowledge be if the Germans conquered Britain?  For the purpose of a future historian, did those raids happen, or didn’t they? The answer is: If Hitler survives, they happened, and if he falls they didn’t happen. So with innumerable other events of the past ten or twenty years. Is the Protocols of the Elders of Zion a genuine document? Did Trotsky plot with the Nazis? How many German aeroplanes were shot down in the Battle of Britain? Does Europe welcome the New Order? In no case do you get one answer which is universally accepted because it is true: in each case you get a number of totally incompatible answers, one of which is finally adopted as the result of a physical struggle. History is written by the winners. [bold emphasis added]

46 All excerpts taken from Candide and Other Tales written by Voltaire, translated by T. Smollett, revised by James Thornton, published by J. M. Dent & Sons Ltd, London, first published 1937. Incidentally, my own personal copy of this book was saved from the flames of my parents’ wood-burning stove after I discovered it hidden amongst hundreds of old textbooks destined to become fuel for their central heating system.

47 All excerpts taken from How Much do You Know? (p. 215) published by Odhams Press Limited, Long Acre, London, WC2. Date of publication unknown, but definitely pre-WWII on the basis of, for example, the question “what territory did Germany lose after the World War?” (on p. 164).

48 For instance, in German, Geschichte, in Russian история, and in French histoire.

49 Quote from William Shakespeare’s The Tragedy of King Richard the Second, Act II, Scene 1, spoken by John of Gaunt.

50 In their book Trump and the Puritans (published in 2020), authors James Roberts and Martyn Whittock point to the remarkable coincidence that, almost precisely on the 400th anniversary of the landing of the Mayflower at Plymouth Rock, if Donald Trump is re-elected in 2020 it will be thanks not only to his strong base amongst the Christian Right but also to a more pervasive and enduring belief in Manifest Destiny, American exceptionalism, the making of the New Jerusalem and “the city on the hill”, one that can be traced all the way back to the Pilgrim Fathers.

Speaking with host Afshin Rattansi on RT’s Going Underground, Martyn Whittock outlined this thesis, which offers a convincing account of why so many American Christians support Trump despite his non-religious character traits, and also of why there is greater support for Israel amongst Christian evangelicals than amongst American Jews.

51 The quote is taken from Chapter 4: “Of Constitutions”; Part 2 of Thomas Paine’s Rights of Man, a defence of the French Revolution against charges made by Edmund Burke in his Reflections on the Revolution in France (1790). Rights of Man was first published in two parts in 1791 and 1792 respectively.

In fuller context, Paine writes:

Man will not be brought up with the savage idea of considering his species as his enemy, because the accident of birth gave the individuals existence in countries distinguished by different names; and as constitutions have always some relation to external as well as to domestic circumstances, the means of benefitting by every change, foreign or domestic, should be a part of every constitution. We already see an alteration in the national disposition of England and France towards each other, which, when we look back to only a few years, is itself a Revolution. Who could have foreseen, or who could have believed, that a French National Assembly would ever have been a popular toast in England, or that a friendly alliance of the two nations should become the wish of either? It shows that man, were he not corrupted by governments, is naturally the friend of man, and that human nature is not of itself vicious.

http://www.gutenberg.org/files/3742/3742-h/3742-h.htm

52 The Second Law of Thermodynamics can be stated in a variety of ways but is probably best known as follows: the total entropy of any isolated macroscopic system can never decrease. Here entropy is a precise measure of something that can be loosely described as the total microscopic disorder within the system. The Second Law has many implications. Firstly, it insists upon a direction whenever any system changes, with order steadily giving way to disorder. This in turn implies an irreversibility to events and suggests a propelling “arrow of time”. The Second Law also prohibits any kind of perpetual motion, which by extension sets a limit to the duration of the universe as a whole, since the universe can itself be considered an isolated thermodynamic system and is therefore, as a whole, subject to the Second Law. For this reason the universe is now expected to end in a cosmic whimper, known in physics as “the heat death of the universe” – with all parts having reached a very chilly thermodynamic equilibrium.

It almost seems then that the Second Law of Thermodynamics might be the physical axis about which the diabolical asymmetry of destruction over creation is strung. Just how any universe of intricate complexity could ever have formed in the first instance is mysterious enough, and though the Second Law does not prohibit all orderly formation, so long as pockets of order are counterbalanced by regions of increasing chaos, the law does maintain that the overall tendency is always towards disorder. Form it did, of course, which perhaps implies the existence of an as yet undiscovered but profoundly forceful creative principle – something that may prove to be nothing more or less than another law of thermodynamics.

Here is physicist Richard Feynman wondering about the physical cause of irreversibility and what it tells us about the past:

53

We are survival machines – robot vehicles blindly programmed to preserve the selfish molecules known as genes. This is a truth which still fills me with astonishment.

From The Selfish Gene by Richard Dawkins.

54 This variant on the myth, with its rather Buddhist overtones, does at least account for God’s rage and instant reaction. For according to Genesis, God thereafter says, to no-one in particular: “… the man is become as one of us [sic], to know good from evil.” Our expulsion from the Garden of Eden is not simply His punishment for our disobedience (which is, of course, the doctrine the church authorities are keen to play up), but a safeguard to protect and secure His own divine monopoly. God fearing that left alone in paradise we might now, and as the same passage goes on to elucidate, “take also of the tree of life, and eat, and live for ever.”

Extracts taken from Genesis 3:22. The full verse is as follows: “And the Lord God said, Behold, the man is become as one of us, to know good and evil: and now lest he put forth his hand, and take also of the tree of life, and eat, and live for ever:”

55L’hypocrisie est un hommage que le vice rend à la vertu.” – François de La Rochefoucauld, Maximes (1665–1678), 218.

Alternative translation: “Hypocrisy is a tribute vice pays to virtue.”

56

L’homme est né libre, et partout il est dans les fers. Tel se croit le maître des autres, qui ne laisse pas d’être plus esclave qu’eux.

Translated by G. D. H. Cole (1913) as: “Man is born free; and everywhere he is in chains. One thinks himself the master of others, and still remains a greater slave than they.”

From Part I, Chapter 1 of Du contrat social ou Principes du droit politique [trans: Of The Social Contract, Or Principles of Political Right] (1762) by Jean-Jacques Rousseau, a book in which Rousseau theorised about the best way to establish a political community.

57 Translated by Samuel Moore in cooperation with Frederick Engels (1888):

The proletarians have nothing to lose but their chains. They have a world to win. Working Men of All Countries, Unite!

From Section 4, paragraph 11 of Das Manifest der Kommunistischen Partei [trans: The Communist Manifesto] (1848) by Karl Marx and Friedrich Engels

57a This was first observed by primatologist Jane Goodall, who documented what happened after the splintering of a community of chimpanzees in Gombe Stream National Park in Tanzania. Over the next four years the adult males of the separatist group were systematically killed one by one by members of the remaining original group. Goodall was profoundly disturbed by this revelation and wrote in her memoir Through a Window: My Thirty Years with the Chimpanzees of Gombe:

For several years I struggled to come to terms with this new knowledge. Often when I woke in the night, horrific pictures sprang unbidden to my mind—Satan [one of the apes], cupping his hand below Sniff’s chin to drink the blood that welled from a great wound on his face; old Rodolf, usually so benign, standing upright to hurl a four-pound rock at Godi’s prostrate body; Jomeo tearing a strip of skin from Dé’s thigh; Figan, charging and hitting, again and again, the stricken, quivering body of Goliath, one of his childhood heroes.

58 From “Bible Studies” published in Thomas Lynch’s collection of essays titled Bodies in Motion and At Rest (2011).

59

Stanley Moon [Dudley Moore]: If it hadn’t been for you… we’d still be blissfully wandering about naked in paradise.

George Spiggott aka The Devil [Peter Cook]: You’re welcome, mate. The Garden of Eden was a boggy swamp just south of Croydon. You can see it over there.

Stanley Moon: Adam and Eve were happy enough.

The Devil: I’ll tell you why… they were pig ignorant.

From the 1967 British comedy Bedazzled, directed and produced by Stanley Donen, screenplay by Peter Cook.

Transcript is available here: https://www.scripts.com/script.php?id=bedazzled_3792&p=11

60 From an article titled “shame v. guilt” by Brené Brown, published on her own website on January 14, 2013. https://brenebrown.com/blog/2013/01/14/shame-v-guilt/

61 The quote comes from Sartre’s play No Exit [French: Huis clos] first performed in 1944. Three characters find themselves trapped and forever waiting in a mysterious room which depicts the afterlife. The famous phrase “L’enfer, c’est les autres” or “Hell is other people” is a reference to Sartre’s idea that seeing oneself as apprehended by, and thus made the object of, another person’s conscious awareness involves a perpetual ontological struggle.

It seems that Sartre offered his own clarification, saying:

“Hell is other people” has always been misunderstood. It has been thought that what I meant by that was that our relations with other people are always poisoned, that they are invariably hellish relations. But what I really mean is something totally different. I mean that if relations with someone else are twisted, vitiated, then that other person can only be hell. Why? Because … when we think about ourselves, when we try to know ourselves … we use the knowledge of us which other people already have. We judge ourselves with the means other people have and have given us for judging ourselves.

The quote above is from a talk that preceded a recording of the play issued in 1965. http://rickontheater.blogspot.com/2010/07/most-famous-thing-jean-paul-sartre.html

62 Quote from Aldous Huxley’s collection of essays Adonis and the Alphabet (1956), Chapter 2 titled “Knowledge and Understanding”.

63 Aristotle, Politics, Book 1, section 1253a

64 From “An Essay on the Principle of Population: as it affects the future improvement of society with remarks on the speculations of Mr. Godwin, M. Condorcet, and other writers” by Thomas Robert Malthus (1798), chapter 1.

65 Ibid.

66 “Taking the population of the world at any number, a thousand millions, for instance, the human species would increase in the ratio of — 1, 2, 4, 8, 16, 32, 64, 128, 256, 512, etc. and subsistence as — 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, etc. In two centuries and a quarter, the population would be to the means of subsistence as 512 to 10: in three centuries as 4096 to 13, and in two thousand years the difference would be almost incalculable, though the produce in that time would have increased to an immense extent.” is a prediction taken from chapter 2 of “An Essay on the Principle of Population…” by T. Malthus (1798).

Okay then, here’s the maths: Malthus assumes a population doubling exponentially every 25 years (once every generation). Two and a quarter centuries allows 9 generations, i.e. an increase of 2 to the power of 9, which is the 512-fold increase he correctly claims. Well, what actually happened? In Malthus’s time Britain conducted its first census, recording in 1801 a population of 8,308,000 (thought likely to have been an underestimate), while the world population is estimated to have just reached around 1 billion (precisely as Malthus estimates). So then, according to Malthus’s calculations, the population of Britain alone should now be more than 4 billion (approaching the current global population), and by the same reckoning the population of the world should have exploded past half a trillion!

This is at the extreme upper limit of estimates for the Earth’s carrying capacity: “The estimates of the Earth’s carrying capacity range from under 1 billion to more than 1,000 billion persons. Not only is there an enormous range of values, but there is no tendency of the values to converge over time; indeed, the estimates made since 1950 exhibit greater variability than those made earlier.” from UN World Population Report 2001, p.30.
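For readers who want to verify the arithmetic, Malthus’s two ratios can be sketched in a few lines of Python. This is purely illustrative, using only the figures quoted above (25-year doubling, 225 years, the 1801 census figure); the variable names are my own invention:

```python
# Malthus's geometric vs arithmetic ratios, using only figures quoted above.
DOUBLING_PERIOD = 25   # years per generation (Malthus's assumption)
YEARS = 225            # "two centuries and a quarter"

generations = YEARS // DOUBLING_PERIOD   # 225 / 25 = 9 generations
population_ratio = 2 ** generations      # geometric growth: 2^9 = 512
subsistence_ratio = 1 + generations      # arithmetic growth: 10

print(population_ratio, subsistence_ratio)   # 512 10

# Applying the same geometric ratio to Britain's 1801 census figure:
britain_1801 = 8_308_000
projected = britain_1801 * population_ratio
print(projected)   # 4253696000, i.e. over 4 billion, as claimed above
```

The 512-to-10 ratio and the 4-billion-plus projection both check out against the figures in the footnote.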

67 Now known as The Royal Statistical Society (after receiving its Royal Charter in 1887).

68 Letter sent to Tennyson by Charles Babbage in response to his poem “The Vision of Sin”, published 1842. The exact details of this letter seem to vary according to the source. In another version he signs off saying, “Strictly speaking, the actual figure is so long I cannot get it into a line, but I believe the figure 1 1/16 will be sufficiently accurate for poetry.”

69

After 30 years of rapid growth in agricultural production, the world can produce enough food to provide every person with more than 2,700 Calories per day, a level which is normally sufficient to ensure that all have access to adequate food, provided distribution is not too unequal.

From report of World Food Summit of FAO (Rome 13-17 November 1996), entitled Food for All.

70

“[However,] the slowdown [of worldwide agricultural production] has occurred not because of shortages of land or water but rather because demand for agricultural products has also slowed. This is mainly because world population growth rates have been declining since the late 1960s, and fairly high levels of food consumption per person are now being reached in many countries, beyond which further rises will be limited.” – “This study suggests that world agricultural production can grow in line with demand, provided that the necessary national and international policies to promote agriculture are put in place. Global shortages are unlikely, but serious problems already exist at national and local levels and may worsen unless focused efforts are made.” – “Agricultural production could probably meet expected demand over the period to 2030 even without major advances in modern biotechnology.”

Extracts from the Executive Summary of the FAO summary report World agriculture: towards 2015/2030, published in 2002.

71 Maslow’s ideas have fallen by the wayside, which is a pity because his study of human need was a worthwhile project. Maslow’s reductionism is wrong, but perhaps by considering a more intricate and dynamic interconnectedness between human needs, his theory can be usefully revised. The trouble with Maslow’s model is its insistence on hierarchy, something that other academics, and especially those working in the social sciences, are inclined to mistake for a kind of verified truth. Just calling an idea ‘a theory’ doesn’t make it so, certainly not in any rigorous sense, yet those not trained in the hard sciences are often inclined to treat speculative formulations as though they were fully-fledged theories. This grave and recurring error infuriates many people, myself included, and especially those who have received specialist scientific training.

72 All subsequent passages and quotations in this chapter are also taken from “An Essay on the Principle of Population: as it affects the future improvement of society with remarks on the speculations of Mr. Godwin, M. Condorcet, and other writers” by Thomas Robert Malthus (1798), chapters 18 and 19.

73 His ideas on these daunting topics are rather cleverly-conceived, unusual if not wholly original, and tread a line that is unorthodox and close to being heretical. So it’s really in these closing chapters that Malthus is most engaging and most at ease. Here, for example, is the Malthusian take on mind and matter:

It could answer no good purpose to enter into the question whether mind be a distinct substance from matter, or only a finer form of it. The question is, perhaps, after all, a question merely of words. Mind is as essentially mind, whether formed from matter or any other substance. We know from experience that soul and body are most intimately united, and every appearance seems to indicate that they grow from infancy together… As we shall all be disposed to agree that God is the creator of mind as well as of body, and as they both seem to be forming and unfolding themselves at the same time, it cannot appear inconsistent either with reason or revelation, if it appear to be consistent with phenomena of nature, to suppose that God is constantly occupied in forming mind out of matter and that the various impressions that man receives through life is the process for that purpose. The employment is surely worthy of the highest attributes of the Deity.

Having safely negotiated the potential minefield of Cartesian dualism, Malthus now applies himself to the tricky problem of evil, and its relationship to “the wants of the body”:

The first great awakeners of the mind seem to be the wants of the body… The savage would slumber for ever under his tree unless he were roused from his torpor by the cravings of hunger or the pinchings of cold, and the exertions that he makes to avoid these evils, by procuring food, and building himself a covering, are the exercises which form and keep in motion his faculties, which otherwise would sink into listless inactivity. From all that experience has taught us concerning the structure of the human mind, if those stimulants to exertion which arise from the wants of the body were removed from the mass of mankind, we have much more reason to think that they would be sunk to the level of brutes, from a deficiency of excitements, than that they would be raised to the rank of philosophers by the possession of leisure.

74 Malthus, aware of the dangers of over-generalisation, adds a little later that:

There are undoubtedly many minds, and there ought to be many, according to the chances out of so great a mass, that, having been vivified early by a peculiar course of excitements, would not need the constant action of narrow motives to continue them in activity.” Saying later again that: “Leisure is, without doubt, highly valuable to man, but taking  man as he is, the probability seems to be that in the greater number of instances it will produce evil rather than good.

75 “Essais de Théodicée sur la bonté de Dieu, la liberté de l’homme et l’origine du mal” (more simply known as Théodicée), which translates from French as “Essays of theodicy on the goodness of God, the freedom of man and the origin of evil”.

76 Malthus also offers us reasons to be cheerful and indeed grateful for our world of apparent imperfection:

Uniform, undiversified perfection could not possess the same awakening powers. When we endeavour then to contemplate the system of the universe, when we think of the stars as the suns of other systems scattered throughout infinite space, when we reflect that we do not probably see a millionth part of those bright orbs that are beaming light and life to unnumbered worlds, when our minds, unable to grasp the immeasurable conception, sink, lost and confounded, in admiration at the mighty incomprehensible power of the Creator, let us not querulously complain that all climates are not equally genial, that perpetual spring does not reign throughout the year, that all God’s creatures do not possess the same advantages, that clouds and tempests sometimes darken the natural world and vice and misery the moral world, and that all the works of the creation are not formed with equal perfection. Both reason and experience seem to indicate to us that the infinite variety of nature (and variety cannot exist without inferior parts, or apparent blemishes) is admirably adapted to further the high purpose of the creation and to produce the greatest possible quantity of good.

77

This view of the state of man on earth will not seem to be unattended with probability, if, judging from the little experience we have of the nature of mind, it shall appear upon investigation that the phenomena around us, and the various events of human life, seem peculiarly calculated to promote this great end, and especially if, upon this supposition, we can account, even to our own narrow understandings, for many of those roughnesses and inequalities in life which querulous man too frequently makes the subject of his complaint against the God of nature.

Taken from Chapter 18. Ibid.

78 There are of course modern reinventions of the Malthusian message, which still play a significant role in our current political debate. These depend on extending Malthus’s idea into considerations of resource shortages of other kinds, such as energy (and after all, food is the primary form of energy for human beings) and water. This however is an area that I wish to save for future writing.


the life lepidopteran

The following article is an Interlude between Parts I and II of a book entitled Finishing The Rat Race

All previously uploaded chapters are available (in sequence) by following the link above or from category link in the main menu, where you will also find a table of contents and a preface on why I started writing it.

*

“Once upon a time, I, Chuang Chou, dreamt I was a butterfly, fluttering hither and thither, to all intents and purposes a butterfly. I was conscious only of my happiness as a butterfly, unaware that I was Chou. Soon I awaked, and there I was, veritably myself again. Now I do not know whether I was then a man dreaming I was a butterfly, or whether I am now a butterfly, dreaming I am a man.”

— Chuang Tzu 1

*

Before proceeding further, I’d like to tell a joke:

A man walks into a doctor’s.

“Doctor, Doctor, I keep thinking I’m a moth,” the man says.

The doctor gives him a serious look. “Sorry, but I am not strictly qualified to help you,” he replies, rubbing his chin earnestly before adding after a momentary pause, “You really need to see a psychiatrist.”

“Yes,” says the man, “but your light was on.”

*

There can be no doubting that each of us acts to a considerable extent in accordance with mental processes that lie far beyond, and are often alien to, our immediate conscious awareness and understanding. For instance, in general we draw breath without the least consideration, or raise an arm, perhaps to scratch ourselves, with scarcely a thought and zero comprehension of how we actually moved our hand and fingers to accomplish the act. And this everyday fact becomes more startling once we consider how even complex movements and sophisticated patterns of behaviour seem to originate without full conscious direction or awareness.

Consider walking for instance. After admittedly painstaking practice as infants, we soon become able to walk without ever thinking to swing our legs. Likewise, if we have learnt to drive, eventually we are able to manoeuvre a large vehicle with hardly more conscious effort than we apply to walking. The same is true for most daily tasks, which are performed no less thoughtlessly and which, in spite of their intricacies, we often find boring and mundane. For instance, those who have been smokers may be able to perform the rather complicated art of rolling a cigarette without pausing from conversation. Indeed, deep contemplation will probably leave us more bewildered than anything by the mysterious coordinated manipulation of all eight fingers and opposing thumbs.

Stranger still is that our ordinary conversational speech proceeds before we have formed the fully conscious intent to utter our actual words! When I first heard this claim, it struck me as so unsettling that I automatically rejected it outright in what ought perhaps to be called a tongue-jerk reaction. (Not long afterwards I was drunk enough to stop worrying about the latent implications!) For considered dispassionately, it is self-evident that there isn’t remotely sufficient time to construct each and every utterance consciously and in advance of the act of speaking; so our vocal ejaculations (as they once were unashamedly called) are just that – they are thrown out! Still further proof is provided by instances when gestures or words emerge in direct conflict to our expressed beliefs and ideas. Those embarrassing occasions when we blurt out what we know must never be spoken we call Freudian slips (and more on Freud below).

More positively, and especially when we enter ‘the zone’, each of us is able to accomplish complex physical acts – for instance throwing, catching, or kicking a ball – often before any conscious thought arises to do so. Those who have played a sport long enough can probably recall many joyous moments when they have marvelled not only at their own impossible spontaneity, but at the accompanying accuracy, deftness, nimbleness, and on very rare occasions even enhanced physical strength. Likewise, urges, feelings, fears and sometimes the most profound insights will suddenly spring forth into “the back of our minds”, as if from nowhere. And as a consequence, this apparent nowhere acquired a name: coming to be known as “the preconscious”, “the subconscious” and, latterly, “the unconscious”.

What this means, of course, is that “I” am not what I ordinarily think I am, but in actuality a lesser aspect of a greater being who enjoys remarkable talents and abilities beyond what are ordinarily thought “my own” since they lie outside “my” immediate grasp. In this way, we all have hidden depths that can and do give rise to astonishment, although for peculiar reasons of pride, we tend in general to feign ignorance of this everyday fact.

*

The person most popularly associated with the study of the human unconscious is Sigmund Freud, a pioneer in the field but by no means its discoverer. In fact the philosopher and all-round genius Gottfried Leibniz has a prior claim to the discovery, having suggested that our conscious awareness may be influenced by “insensible stimuli” that he called petites perceptions 1. Another giant of German philosophy, Immanuel Kant, subsequently proposed the existence of lurking ideas of which we are not fully aware, while admitting the apparent contradiction inherent in such a conjecture:

“To have ideas, and yet not be conscious of them, — there seems to be a contradiction in that; for how can we know that we have them, if we are not conscious of them? Nevertheless, we may become aware indirectly that we have an idea, although we be not directly cognizant of the same.” 2

Nor is it the case that Freud was first in attempting any kind of formal analysis of the make-up and workings of the human psyche as an entity. Already in 1890, William James had published his own ground-breaking work Principles of Psychology, and though James was keen to explore and outline his principles for human psychology by “the description and explanation of states of consciousness”, rather than to plunge more deeply into the unknown, he remained fully aware of the potentiality of unconscious forces and made clear that any “‘explanation’ [of consciousness] must of course include the study of their causes, conditions and immediate consequences, so far as these can be ascertained.” 3

*

William James’ own story is both interesting and instructive. As a young man he had been somewhat at a loss to decide what to do with himself. Having briefly trained as an artist, he quickly realised that he’d never be good enough and became disillusioned with the idea, declaring that “there is nothing on earth more deplorable than a bad artist”. He afterwards retrained in chemistry, enrolling at Harvard in 1861 (a few months after the outbreak of the American Civil War), but, restless again, twelve months or so later he transferred to biology. Still only twenty-one, James soon felt that he was running out of options, writing in a letter to his cousin:

“I have four alternatives: Natural History, Medicine, Printing, Beggary. Much may be said in favour of each. I have named them in the ascending order of their pecuniary invitingness. After all, the great problem of life seems to be how to keep body and soul together, and I have to consider lucre. To study natural science, I know I should like, but the prospect of supporting a family on $600 a year is not one of those rosy dreams of the future with which the young are said to be haunted. Medicine would pay, and I should still be dealing with subjects which interest me – but how much drudgery and of what an unpleasant kind is there!”

Three years on, James entered the Harvard Medical School, where he quickly became disillusioned. Certain that he no longer wished to become a practicing doctor, and being more interested in psychology and natural history than medicine, he seized a fresh opportunity and set sail for the Amazon in hopes of becoming a naturalist. However, the expedition didn’t work out well either. He was fed up with collecting bugs, bored with the company of his fellow explorers and, to cap everything, he fell quite ill. Although desperate to return home, he was obliged to continue, and slowly he regained his strength, deciding that in spite of everything it had been a worthwhile diversion; no doubt heartened too by the prospect of finally returning home.

It was 1866 when James next resumed medical studies at Harvard, although the Amazon adventure had left him physically and (very probably) psychologically weakened; continuing sickness forced him to break off from his studies yet again. Seeking rest and recuperation, for the next two years James sojourned in Europe, where, to judge from his own accounts, he again experienced a great deal of isolation, loneliness and boredom. Returning to America at the end of 1868 – now approaching twenty-seven years old – he picked up his studies at Harvard for the last time, successfully passing his degree to become William James M.D. in 1869.

Too weak to find work anyway, James stayed resolute in his unwillingness to become a practicing doctor. So for a prolonged period he did nothing at all, or next to nothing. Three years passed in which, besides the occasional publication of articles and reviews, he devoted himself solely to reading books and thinking thoughts, often quite gloomy ones. Then one day he had a semi-miraculous revelation: a very dark revelation that made him exceedingly aware not only of his own mental fragility, but of the likely prognosis:

“Whilst in this state of philosophic pessimism and general depression of spirits about my prospects, I went one evening into the dressing room in the twilight… when suddenly there fell upon me without any warning, just as if it came out of the darkness, a horrible fear of my own existence. Simultaneously there arose in my mind the image of an epileptic patient whom I had seen in the asylum, a black-haired youth with greenish skin, entirely idiotic, who used to sit all day on one of the benches, or rather shelves, against the wall, with his knees drawn up against his chin, and the coarse gray undershirt, which was his only garment, drawn over them, inclosing his entire figure. He sat there like a sort of sculptured Egyptian cat or Peruvian mummy, moving nothing but his black eyes and looking absolutely non-human. This image and my fear entered into a species of combination with each other. That shape am I, I felt, potentially. Nothing that I possess can defend me against that fate, if the hour for it should strike for me as it struck for him. There was such a horror of him, and such a perception of my own merely momentary discrepancy from him, that it was as if something hitherto solid within my breast gave way entirely, and I became a mass of quivering fear. After this the universe was changed for me altogether. I awoke morning after morning with a horrible dread at the pit of my stomach, and with a sense of the insecurity of life that I never knew before, and that I have never felt since. It was like a revelation; and although the immediate feelings passed away, the experience has made me sympathetic with the morbid feelings of others ever since.” 4

Having suffered what today would very likely be called ‘a nervous breakdown’, James was forced to reflect on the current theories of the mind. Previously, he had accepted the materialist ‘automaton theory’ – that our ability to act upon the world depends not upon conscious states as such, but upon the brain-states that underpin and produce them – but now he felt that if true this meant he was personally trapped forever in a depression that could only be cured by the administering of some kind of physical remedy. However, no such remedy was obtainable, and so he was forced instead to tackle his disorder by means of further introspection and self-analysis.

James read more and thought more since there was nothing else he could do. Three more desperately unhappy years would pass before he had sufficiently recuperated to rejoin the ordinary world, accepting an offer to become lecturer in physiology at Harvard. But as luck would have it, teaching suited James. He enjoyed the subject of physiology itself, and found the activity of teaching “very interesting and stimulating”. James had, for once, landed on his feet, and his fortunes were also beginning to improve in other ways.

Enjoying the benefits of a steady income for the first time in his life, he was soon to meet Alice Gibbons, the future “Mrs W.J.” They married two years later in 1878. She was a perfect companion – intelligent, perceptive, encouraging, and perhaps most importantly for James, an organising force in his life. He had also just been offered a publishing contract to write a book on his main specialism, which was by now – and in spite of such diversity of training – most definitely psychology. With everything now in place, James set to work on what would be his magnum opus. Wasting absolutely no time whatsoever, he drafted the opening chapters while still on honeymoon.

“What is this mythological and poetical talk about psychology and Psyche and keeping back a manuscript composed during honeymoon?” he wrote in jest to the taunts of a friend, “The only psyche now recognized by science is a decapitated frog whose writhings express deeper truths than your weak-minded poets ever dreamed. She (not Psyche but the bride) loves all these doctrines which are quite novel to her mind, hitherto accustomed to all sorts of mysticisms and superstitions. She swears entirely by reflex action now, and believes in universal Nothwendigkeit. [determinism]” 5

It would take James more than a decade to complete what quickly became the definitive university textbook on the subject, ample time for such ingrained materialist leanings to have softened. For the most part sticking to what was directly and consciously known to him, his attempts to dissect the psyche involved much painstaking introspection of what he famously came to describe as his (and our) “stream of consciousness”. Such close analysis of the subjective experience of consciousness itself had suggested to James the need to distinguish between “the Me and the I” as separate component parts of what in completeness he called “the self”. 6 In one way or another, this division of self into selves, whether these be consciously apprehensible or not, has remained a theoretical basis of all later methods of psychoanalysis.

There is a joke that Henry James was a philosopher who wrote novels, whereas his brother William was a novelist who wrote philosophy. But this does William a disservice. James’ philosophy, known as pragmatism, was a later diversion. Unlike his writings on psychology, which became the standard academic texts as well as popular best-sellers (and what better tribute to James’s fluid prose), his ideas on pragmatism were rather poorly received (though they have gained more favour over time). But then James was a lesser expert in philosophy, a situation not helped by his distaste for logical reasoning, and he would be better remembered for his writings on psychology, a subject in which he excelled. Freud’s claim to originality, by comparison, is nothing like as foundational.

James was at the vanguard during the period when psychology finally pulled apart from the grip philosophy had held on it (which explains why James was notionally Professor of Philosophy at the time he was writing) and was grafted back to form a subdiscipline of biology. For this reason, and in spite of the fact that James remained highly critical of the developing field of experimental psychology – as he was too of the armchair reasoners on both sides of the English Channel, the British empiricists Locke and Hume, and the continental giants Leibniz, Kant and Hegel – to some of his contemporaries, James’ view appeared all too dangerously materialistic. If only they could have seen how areas of psychology were to develop so ruinously, they would have appreciated that James was, as always, a moderate.

*

While James remained an academic throughout his whole life, Freud – after briefly studying zoology at the University of Vienna, including one month spent unsuccessfully searching for the gonads of the male eel 7, and a further spell doing neurology – decided to return to medicine and open his own practice. He had also received expert training in the new-fangled techniques of hypnosis.

‘Hypnosis’ comes from the Greek hupnos and means, in effect, “artificial sleep”. To induce hypnosis, the patient’s conscious mind needs to be distracted briefly, and achieving this opens up regions of the mind beyond the usual conscious states. The terms “sub-conscious” and “unconscious” had already been in circulation prior to the theories of Freud or James. And whether named or not, mysterious evidence of the unconscious had always been known. Dreams, after all, though we consciously experience them, are neither consciously conceived nor willed. They just pop out from nowhere – or from “the unconscious”.

From his clinical experiences, Freud soon discovered what he believed to be better routes to the unconscious than hypnosis. For instance, he found that it was just as effective to listen to his patients, or if their conscious mind was unwilling to give up some of its defences – as it commonly was – then to encourage their free association of words and ideas. He also looked for unconscious connections within his patients’ dreams, gradually uncovering, what he came to believe were the deeply repressed animalistic drives that govern the patient’s fears, attitudes and behaviour. Having found the unconscious root to their problems, the patient could finally begin to grapple with these repressed issues at an increasingly conscious level. It was a technique that apparently worked, with many of Freud’s patients recovering from the worst effects of their neuroses and hysteria, and so “the talking cure” became a lasting part of Freud’s legacy. You lay on the couch, and just out of sight, Freud listened and interpreted.

But Freud also left a bigger mark, by helping to shape the way we see ourselves. The types of unconscious repression he discovered in his own patients, he believed were universally present, and through drawing directly on his experiences as doctor, he slowly excavated, as he found it, the entire human unconscious piece by piece. Two of these aspects he labelled the ‘id’ and the ‘superego’: the one a seat of primal desires, the other a chastising moral guide – reminiscent of the squabbling devil-angel duo that pop up in cartoons, jostling for attention on opposite shoulders of the character whenever he’s plunged into a moral quandary. 8

In a reboot of philosopher Arthur Schopenhauer’s concept of blind and insatiable ‘will’, Freud proposed the existence of the libido: a primary, sexual drive that ceaselessly operates beneath our conscious awareness, prompting desires for pleasure and avoidance of pain irrespective of consequence and regardless of whether these desires conflict with ordinary social conventions. In concert with all of this, Freud discerned a natural process of psychological development 9 and came to believe that whenever this development is arrested or, more generally, whenever normal appetites are consciously repressed, then lurking deep within the unconscious, such repressed but instinctual desires will inevitably and automatically resurface in more morbid forms. This, he determined, was the common root cause of all his patients’ various symptoms and illnesses.

Had Freud stopped there, his contribution to psychology would have been fully commendable, for there is tremendous insight in these ideas. He says too much no doubt (especially when it comes to the specifics of human development), but he also says something that needed to be said very urgently: that if you force people to behave against their natures you will make them sick. So it seems a pity that Freud carried some of the ideas a little too far.

Let’s take the ‘Oedipus complex’, which, of the many Freudian features of our supposed psychological nether regions, is without doubt the one of greatest notoriety. The myth of Oedipus is enthralling; the eponymous hero compelled to deal with fate, misfortune and prophesy. 10 Freud finds in this tale a revelation of deep and universal unconscious repression, and though plausible and intriguing, his interpretation rather narrows its far grander scope:

“[Oedipus’s] destiny moves us only because it might have been ours – because the Oracle laid the same curse upon us before our birth as upon him. It is the fate of all of us, perhaps, to direct our first sexual impulse towards our mother and our first hatred and our first murderous wish against our father. Our dreams convince us that this is so.”11

Freud generally studied those with minor psychological problems (and did not deal with cases of psychosis), determining on the basis of an unhappy few what he presumed true for healthier individuals too, and this is perhaps a failure of all psychoanalytic theories. For though it may seem odd that he came to believe in the universality of the Oedipus Complex, who can doubt that his clients suffered from something like it? Who can doubt that Freud himself harboured the same dark desires? Perhaps he also felt a ‘castration anxiety’ as a result of the Oedipal rivalry he’d had with his own father. Maybe he actually experienced ‘penis envy’, if not of the same intensity as he said he detected in his female patients, then of a compensatory masculine kind! After all, such unconscious ‘transference’ of attitudes and feelings from one person to another – from patient onto the doctor, or vice versa in this relevant example – is another concept that Freud was first to identify and label.

*

Given the strait-laced age in which Freud fleshed out his ideas, the swiftness with which these theories received widespread acceptance and acclaim seems surprising, although there are surely two good reasons why Freudianism took hold. The first is straightforward: society had been very badly in need of a dose of Freud, or something very like Freud. After such excessive prudishness, the pendulum was bound to swing the other way. But arguably the more important reason – indeed the reason his theories have remained influential – is that Freud picked up the baton directly from where Darwin left off. By restricting his explanations to biological instincts and drives, Freudianism acquired the mantle of scientific legitimacy, a vital determining factor that helped to secure its prominent position within the modern epistemological canon.

Following his precedent, students of Freud, most notably Carl Jung and Alfred Adler, also drew on clinical experiences with their own patients, but gradually came to the conclusion, for different reasons, that Freud’s approach was too reductionist, and that there is considerably more to a patient’s mental well-being than healthy appetites and desires, and thus more to the psychological underworld than solely matters of sex and death.

Where Freud was a materialist and an atheist, Jung went on to incorporate aspects of the spiritual into his extended theory of the unconscious, though he remained respectful of biology and keen to anchor his own theories upon an evolutionary bedrock. Jung nevertheless speculates in a philosophical tradition that owes much to Immanuel Kant, while also drawing heavily on personal experience, and comes to posit the existence of psychical structures he calls ‘archetypes’, operating again at the deepest levels within a collective unconscious; a shared characteristic due to our common ancestry.

Thus he envisions ‘the ego’ – the aspect of our psyche we identify as “I” – as existing in relation to an unknown and finally unknowable sea inhabited by autonomous entities which have their own life. Jung actually suggests that Freud’s Oedipus complex is just one of these archetypes, while he finds himself drawn by the bigger fish of the unconscious beginning with ‘The Shadow’ – what is hidden and rejected by the ego – and what he determines are the communicating figures of ‘Animus/Anima’ (or simply ‘The Syzygy’) – a compensatory masculine/feminine unconscious presence within, respectively, the female and male psyche – that prepare us for incremental and never-ending revelations of our all-encompassing ‘Self’.

This lifelong psychical development, or ‘individuation’, was seen by Jung as an inherently religious quest and he is unapologetic in proclaiming so; the religious impulse being a product too of human evolutionary development along with opposable thumbs and upright posture. More than a mere vestigial hangover, religion is, Jung says, fundamental to the deep nature of our species.

Unlike Freud, Jung was also invested in understanding how the human psyche varies greatly from person to person, and to these ends introduced new ideas about character types, adding ‘introvert’ and ‘extrovert’ to the psychological lexicon to draw a division between individuals characterised either by primarily subjective or objective orientations to life – an introvert himself, Jung was able to observe such a clear distinction. Meanwhile, greatly influenced by Friedrich Nietzsche’s “will to power”, Adler switched attention to issues of social identity and specifically to why people felt – in very many cases quite irrationally – inferior or superior amongst their peers. These efforts culminated in the development of his theory of the ‘inferiority complex’ – which might also be thought of as an aspect of the Jungian ‘Shadow’.

These different schools of psychoanalysis are not irreconcilable. They are indeed rather complementary in many ways: Freud tackling the animal craving and want of pleasure; Jung looking for expression above and beyond what William Blake once referred to as “this vegetable world”; and Adler delving most directly into the mud of human relations, the pervasive urge to dominate and/or be submissive, and the consequences of personal trauma associated with interpersonal and societal inequalities.

Freud presumes that since we are biological products of Darwinian evolution, our minds have been evolutionarily pre-programmed. Turning the same inquiry outward, Jung goes in search of common symbolic threads within mythological and folkloric traditions, enlisting these as evidence for the psychological archetypes buried deep within us all. And though Jung held no orthodox religious views of his own, he felt comfortable drawing upon religious (including overtly Christian) symbolism. In one of his most contemplative passages, he wrote:

Perhaps this sounds very simple, but simple things are always the most difficult. In actual life it requires the greatest art to be simple, and so acceptance of oneself is the essence of the moral problem and the acid test of one’s whole outlook on life. That I feed the beggar, that I forgive an insult, that I love my enemy in the name of Christ—all these are undoubtedly great virtues. What I do unto the least of my brethren, that I do unto Christ.

But what if I should discover that the least amongst them all, the poorest of all beggars, the most impudent of all offenders, yea the very fiend himself—that these are within me, and that I myself stand in need of the alms of my own kindness, that I myself am the enemy who must be loved—what then? Then, as a rule, the whole truth of Christianity is reversed: there is then no more talk of love and long-suffering; we say to the brother within us “Raca,” and condemn and rage against ourselves. We hide him from the world, we deny ever having met this least among the lowly in ourselves, and had it been God himself who drew near to us in this despicable form, we should have denied him a thousand times before a single cock had crowed. 12

Of course, “the very fiend himself” is the Jungian ‘Shadow’, the contents of which without recognition and acceptance then inevitably remain repressed, causing these unapproachable and rejected aspects of our own psyche to be projected out on to the world. ‘Shadow projection’ onto others fills the world with enemies of our own imagining; and this, Jung believed, was the root of nearly all evil. Alternatively, by taking Jung’s advice and accepting “that I myself am the enemy who must be loved”, we come back to ourselves in wholeness. It is only then that the omnipresent threat of the Other diminishes, as the veil of illusion forever separating the ego and reality is thinned. And Jung’s psychological reunification also grants access to previously concealed strengths (the parts of the unconscious discussed at the top), further enabling us to reach our fullest potential. 13

Today there are millions doing “shadow work” as it is now popularly known: self-help exercises often combined with traditional practices of yoga, meditation or the ritual use of entheogens: so here is a new meeting place – a modern mash-up – of religion and psychotherapy. Quietly and individually, a shapeless movement has arisen almost spontaneously as a reaction to the peculiar rigours of western civilisation. Will it change the world? For better or worse, it already has.

Alan Watts, who is best known for his Western interpretations of Eastern spiritual traditions, in particular Zen Buddhism and Daoism, here reads this same influential passage from one of Jung’s lectures, in which Jung speaks of ending “the inner civil war”:

*

Now what about my joke at the top? What’s that all about? Indeed, and in all seriousness, what makes it a joke at all? Well, not wishing to delve deeply into theories of comedy, there is one structure that arises repeatedly and nearly universally: the punch line to every joke relies on some kind of unexpected twist on the set-up.

To illustrate the point, let’s turn to the most hackneyed joke of all: “Why did the chicken cross the road?” Here we find an inherent ambiguity that lies within use of the word ‘why’ and this is what sets up the twist. However, in the case of the joke about the psychiatrist and the man who thinks he’s a moth, the site of ambiguity isn’t so obvious. But here the humour I think comes down to alternative and finally conflicting notions of ‘belief’.

A brief digression then: What is belief? To offer a salient example, when someone tells you “I believe in God”, what are they intending to communicate? No less importantly, what would you take them to mean? Put differently, atheists will very often say “I don’t believe in anything” – so again, what are they (literally) trying to convey here? And what would a listener take them to mean? Because in all these instances the same word is used to describe similar but distinct attitudinal relationships to reality, when it is all-too-easy to presume that everyone is using the word in precisely the same way. But first, we must acknowledge that the word ‘belief’ actually carries two quite distinct meanings.

According to the first definition, it is “a mental conviction of the truth of an idea or some aspect of reality”. Belief in UFOs fits this criterion, as does a belief in gravity and that the sun will rise again tomorrow. How about belief in God? When late in life Jung was asked if he believed in God, he replied straightforwardly “I know”. 14 Others reply with the same degree of conviction if asked about angels, fairies, spirit guides, ghosts or the power of healing and crystals. As a physicist, I believe in the existence of atoms, electrons and quarks – although I’ve never “seen one”, like Jung I know!

So belief in this sense is more often than not grounded in a person’s direct experience, which obviously does not by itself validate the objective truth of the belief. He saw a ghost. She was healed by the touch of a holy man. We ran experiments to measure the charge on an electron. Again, in this sense I have never personally known anyone who did not believe in the physical reality of a world of solid objects – for who doesn’t believe in the existence of tables and chairs? In this important sense everyone has many convictions about the truth of reality, and we surely all believe in something – this applies even in the case of the most hardline of atheists!

But there is also a second kind of belief: “of an idea that is believed to be true or valid without positive knowledge.” The emphasis here is on the lack of knowledge or indeed of direct experience. So this belief involves an effort of willing on the part of the believer. In many ways, this is to believe in make-believe, or we might just say “to make-believe”; to pretend or wish that something is real: the suspension of disbelief. I believe in unicorns…

As a child, I found all religion utterly mystifying, since what was self-evidently make-believe – for instance a “holy ghost” and the virgin birth! – would, for reasons I was unable to fathom, be held by others as sacrosanct. Based on my casual encounters with Christians, it also seemed evident that the harder you tried to make-believe in this maddening mystification of being, the better a person it made you! So here’s the point: when someone tells you they believe in God, is this all they actually mean? That they are trying with tremendous exertion, although little conviction, to make-believe in impossibility?

Indeed, is this striving alone mistaken not only as virtuous but as actual believing in the first sense? Yes, quite possibly – and not only for religious types. Alternatively, it may be that someone truly believes in God – or whatever synonym they choose to approximate to ‘cosmic higher consciousness’ – with the same conviction that all physicists believe in gravity and atoms. They may come to know ‘God’, as Jung did.

Now back to the joke and apologies for killing it: The man complains that he feels like a moth and this is so silly that we automatically presume his condition is entirely one of make-believe. But then the twist, when we learn that his actions correspond to his belief, which means, of course, he has true belief of the first kind. Finally, here’s my hunch then for why we find this funny: it spontaneously reminds us of how true beliefs – rather than make-believe – both inform reality as we perceive it, and fundamentally direct our behaviour. Yet we are always in the process of forgetting altogether that this is how we live too, until abruptly the joke reminds us again – and in our moment of recollecting, spontaneously we laugh.

Which also raises a question: To what extent do beliefs of the second ‘make-believe’ kind determine our behaviour too? Especially when the twin definitions show just how easy it can be to get confused over beliefs. Because as Kurt Vonnegut wrote in the introduction to his cautionary novel Mother Night: “This is the only story of mine whose moral I know”, continuing: “We are what we pretend to be, so we must be careful about what we pretend to be.” 15

*

I would like to return now to an idea I earlier disparaged, Dawkins’s concept of ‘memes’: ideas, stories, and other cultural fragments, whose development and transmission can be considered similar to the mutation and survival of genes. In invoking the concept of memes, Dawkins had hoped to set human behaviour apart from the rest of biology in order to present an account of how it came to be that our species alone is capable of surpassing the hardwired instructions encoded in our genes. For Dawkins this entailed some fleeting speculation upon the origins of human culture, set out in the final pages of his popular science book, The Selfish Gene. Others later picked up on his idea and reworked it into a pseudo-scientific discipline known as memetics; something I have already criticised.

In fact, the notion of some kind of evolutionary force actively driving human culture occurred to authors before Dawkins. In The Human Situation, for example, Aldous Huxley outlined his own thoughts on the matter, while already making the significant point that such kinds of “social heredity” must be along Lamarckian rather than Darwinian lines:

“While it is clear that the Lamarckian conception of the inheritance of acquired characteristics is completely unacceptable, and untrue biologically, it is perfectly true on the social, psychological and linguistic level: language does provide us means for taking advantage of the fruits of past experience. There is such a thing as social heredity. The acquisitions of our ancestors are handed down to us through written and spoken language, and we do therefore enjoy the possibility of inheriting acquired characteristics, not through germ plasm but through tradition.”

Like Dawkins, Huxley recognised that culture was the singular feature distinguishing our species from others. Culture on top of nature, dictated by education, religious upbringing, class status, and so forth, establishes the social paradigms according to which individuals in general behave. However, in Huxley’s version, as in Dawkins’s, this is only metaphorically an evolutionary process, and both evidently regard the process of cultural development as most similar to evolution in one key respect: that it is haphazard.

Indeed, Dawkins and Huxley are similarly keen to stress that human culture is therefore a powerful but ultimately ambiguous force that brings about good and ill alike. As Huxley continues:

“Unfortunately, tradition can hand on bad as well as good items. It can hand on prejudices and superstitions just as effectively as it can hand on science and decent ethical codes. Here again we see the strange ambivalence of this extraordinary gift.” 16

We might also carry these ideas a little further by adding a very important determinant of individual human behaviour which such notions of ‘memetics’ have tended to overlook. For memes are basically ideas, and ideas are, by definition, a product and manifestation of conscious thought and transmission; whereas people, on the other hand, as I have discussed above, often behave in ways that conflict with their conscious beliefs and desires, which means that to some extent we act according to mental processes that are beyond or even alien to our immediate understanding.

Acknowledging the influence of the unconscious on our thoughts and behaviours, my contention here is straightforward enough and, I think, hard to dispute: that just as our conscious minds are moulded and differentiated by local customs and conventions, our unconscious minds are presumably likewise formed and diversified. That, to offer a more concrete example, the Chinese unconscious, shaped and informed by almost three millennia of Daoism, Buddhism and Confucianism, is likely to be markedly different from the unconscious mind of any of us raised within the European tradition. Besides the variations due to religio-philosophical upbringing, divergence is likely to be further compounded by the wide disparities in our languages, with dissimilarities in all elements from vocabulary, syntax and morphology down to the use of characters rather than letters.

Native tongue (or mother tongue) is a very direct and primary filter that not only channels what we are able to articulate, but governs what we are able to fully conceptualise or even to think at all. 17 It is perfectly conceivable therefore that anyone who learned to communicate first in Mandarin or Cantonese will be unconsciously differentiated from someone who learnt to speak English, Spanish or Arabic instead. 18 Indeed, to a lesser degree perhaps, all who speak English as a first language may have an alternate, if more subtly differentiated unconscious relationship to the world, from those whose mother tongue is say French or German. 19

So now I come back to the idea of memes in an attempt to resurrect it in an altered form. Like Dawkins’s original proposal, my idea is not rigorous or scientific; it’s another hunch: a way of referencing perhaps slight but characteristic differences in the collective unconscious between nations, tribes and also classes of society. Differences that then manifest perhaps as neuroses and complexes which are entirely planted within specific cultural identities – a British complex, for instance (and certainly we talk of having “an island mentality”). We might say therefore that alongside the transmission of memes, we also need to include the transmission of ‘dremes’ – cultural fragments from our direct social environment that are unconsciously given and received.

*

If this is accepted, then my further contention is that one such dreme has become predominant all around the world, and here I am alluding to what might be christened the ‘American Dreme’. And no, not the “American Dream”, which is different. The American Dream is in fact an excellent example of what Dawkins labelled a meme: a cultural notion that on this occasion encapsulates a collection of ideas about how life can and ought to be. It says that life should be better, richer and fuller for everyone. Indeed, it is written indelibly into the American Declaration of Independence in the wonderful phrase: “Life, Liberty and the pursuit of Happiness.” The American Dream is inspiring and has no doubt been a tremendous liberation for many; engendering technological progress and motivating millions with hopes that anyone living in “The Land of Opportunity” “can make it” “from rags to riches” – all subordinate memes encapsulating different aspects of the fuller American Dream.

E pluribus unum – “Out of many, one” – is the motto inscribed on the scroll held so firmly by the beak of the bald eagle on the Seal of the United States. 20  Again, it is another sub-meme at the heart of the American Dream meme: an emblematic call for an unbound union between the individual and collective; inspiring a loose harmony poetically compared to the relationship of flowers in a bouquet – thus, not a melting-pot, but a richer mosaic that maintains the original diversity.

Underlying this American Dream, a related sub-meme cherishes “rugged individualism”: the aspiration of individuals, not always pulling together, nor necessarily in one direction, but constantly striving upwards: pulling themselves up by their own bootstraps! Why? Because according to the dream at least, if you try hard enough, then you must succeed. And though this figurative pulling yourself up by your own bootstraps involves a physical impossibility that contravenes Newton’s Laws, even this does not detract from the idea. Believers in the American Dream apparently notice no contradiction, despite the fantastical image of their central metaphor. The dream is buoyed so high on hope, even when deep down most know it’s actually a fairy tale.

So finally there is desperation and a sickliness about the American Dream. A harsh reality in which “The Land of Opportunity” turns out to be a steep-sided pyramid spanned by labyrinthine avenues that mostly run to dead-ends. A promised land but one riven by chasms as vast as the Grand Canyon; disparities that grew out of historical failures: insurmountable gulfs in wealth and real opportunity across a population always beset by class and racial inequalities. Indeed, the underclass of modern America is no less stuck within societal ruts than the underclass of the least developed regions on earth, and in relative terms many are worse off. 21 “It’s called the American Dream”, said the late, great satirist George Carlin, “because you have to be asleep to believe it”.

In short, to keep dreaming the American Dream involves an unresting commitment. Its most fervent acolytes live in a perpetually suspended state of ignorance or outright denial; denial of the miseries and cruelties that ordinary Americans daily suffer: the ‘American Reality’.

Graphic from page 56 of Jean Kilbourne’s book Can’t Buy My Love: How Advertising Changes the Way We Think and Feel (originally published in hardcover in 1999 as Deadly Persuasion: ‘Why Women and Girls Must Fight the Addictive Power of Advertising’). It was an ad for a German marketing firm, contained within a decades-old issue of the trade journal ‘Advertising Age’:

[Image: MAGA advert]

But just suppose for a moment that the American Dream actually did come true. That America somehow escaped from this lingering malaise and blossomed into a land of real freedom and opportunity for all as it always promised to be. Yet still an unassailable problem remains. For as with every ascent, the higher you reach the more precarious your position becomes: as apes we have never entirely forgotten how branches are thinner and fewest at the top of the tree.

Moreover, built into the American Dream is its emphasis on material enrichment: to rise towards the heavens therefore means riding up and up and always on a mountain of stuff. And, as you rise, others must, in relative terms, fall. Not necessarily because there isn’t enough stuff to go around, but because success depends upon holding ownership of the greatest share. Which means that as the American Reality draws closer to the American Dream (and it could hardly get much further away), creating optimal social mobility and realisable opportunities for all, then even given this best of all circumstances, the rise of some at the expense of others will cultivate anxious winners and a disadvantaged underclass, for whom the relative material gain of the winners comes at the cost of bearing the stigma of comparative failure.

Why am I not nearer the top of the tree? In the greatest land on earth, why do I remain subservient to the gilded elites? Worries that nowadays plague the insomniac hours of many a hopeful loser; of those who landed up, to a large extent by accidental circumstance, in the all-too-fixed trailer parks of “The Land of the Free” (yet another sub-meme – ironically linked to the country with the highest incarceration rate on earth).

But worse, there is an inevitable shadow cast by the American Dream: a growing spectre of alienation and narcissism that arises from such excessive emphasis on individual achievement: feelings of inferiority for those who missed the boat, and of superiority for those who caught the gravy train. Manipulation is celebrated. Machiavellianism, narcissism and psychopathy come to reign. This shadow is part of what we might call the ‘American Dreme’; an unconscious offspring that contains within it a truly abysmal contrast to the American Dream which bore it. A dreme that, carried upon the coat-tails of the Dream, was spread far and wide by Hollywood and by Disney, radiated out in radio and television transmissions, and in consequence is now becoming the ‘Global Dreme’.

Being unconscious of it, however, we are mostly unaware of any affliction whatsoever; the dreme being insidious, and thus very much more dangerous than the meme. We might even mistake it for something else – having become such a pandemic, we might easily misdiagnose it as a normal part of ‘human nature’.

*

Here is Chris Hedges again with his own analysis of modern day consumerism, totalitarian corporate power and living in a culture dominated by pervasive illusion:

“Working for the American Dream”, first broadcast by the BBC in July 2018 and embedded below, is American comedian Rich Hall’s affectionate though characteristically sardonic portrait of the nation’s foundational and persistent myth:

*

And the joke was hilarious wasn’t it? No, you didn’t like it…? Well, if beauty is in the eye of the beholder, comedy surely lies in the marrow of the funny bone! Which brings me to ask: why is there comedy? More broadly, why is there laughter – surely the most curious human reflex of all – or its very closely related reflex cousin, crying? In fact, the emission of tears from the nasolacrimal ducts other than in response to irritation of our ocular structures, and purely for reasons of joy or sorrow, is a very nearly uniquely human secretomotor phenomenon. (Excuse my Latin!) 22

The jury is still out on the evolutionary function of laughing and crying, but when considered in strictly Darwinian terms (as current science insists), it is hard to fathom why these dangerously debilitating and potentially life-threatening responses ever developed in any species. It is acknowledged indeed that a handful of unlucky (perhaps lucky?) people have literally died from laughter. So why do we laugh? Why do we love laughter, whether ours or others’, so much? Your guess is as good as mine, and, more importantly, as good as Darwin’s:

Many curious discussions have been written on the causes of laughter with grown-up persons. The subject is extremely complex. Something incongruous or unaccountable, exciting surprise and some sense of superiority in the laugher, who must be in a happy frame of mind, seems to be the commonest cause. 23

Less generously, Thomas Hobbes, who explained all human behaviour in terms of gaining social advantage, wrote that:

Joy, arising from imagination of a man’s own power and ability, is that exultation of the mind which is called glorying… Sudden Glory, is the passion which maketh those grimaces called LAUGHTER; and is caused either by some sudden act of their own, that pleaseth them; or by the apprehension of some deformed thing in another, by comparison whereof they suddenly applaud themselves. 24

And indeed, it is true that a great deal of laughter comes at the expense of some butt of our joking; however, not all mockery involves an injured party, and there is a great deal more to humour and laughter than merely ridicule and contempt. So Hobbes’ account is at best a very desiccated postulation for why humans laugh, let alone for what constitutes joy.

Indeed, Hobbes’ reductionism is evidently mistaken and misinformed not only by his deep-seated misanthropy, but also by a seeming lack of common insight which leads one to suspect that when it came to sharing any jokes, he just didn’t get it. But precisely what didn’t he get?

Well, apparently he didn’t get how laughter can be a straightforward expression of joie de vivre. Too French I imagine! Or that when we apprehend anything, this momentarily snaps us from a prior state of inattention, and on finding amusement in an abrupt, often fleeting, but totally fresh understanding, the revelation itself may elicit laughter (as I already outlined above). Or that it is simply impossible to laugh authentically or infectiously unless you not only understand the joke, but fully acknowledge it. In this way, humour, if confessional, can be liberating at a deeply personal level, or if satirical, liberating at a penetrating societal level. Lastly (in my necessarily limited rundown), humour serves as a wonderfully efficient and entertaining springboard for communicating insight and understanding, especially when the truths are dry, difficult to grasp or otherwise unpalatable. Here is a rhetorical economy that Hobbes might actually have approved were it not for his somewhat curmudgeonly disposition.

And why tell a joke here? Just to make you laugh and take your mind off the gravity of the topics covered and still more grave ones to come? To an extent, yes, but also to broaden out our discussion, letting it drift off into related philosophical avenues. For existence is seemingly absurd, is it not? Considered squarely, full-frontal, what’s it all about…? And jokes – especially ones that work beyond rational understanding – offer a playful recognition of the nonsensicalness of existence and of our species’ farcical determination to comprehend it and ourselves fully. What gives us the gall to ever speculate on the meaning of life, the universe and everything?

Meanwhile, we are free to choose: do we laugh or do we cry at our weird predicament? Both responses are surely sounder than cool insouciance, since both are flushed with blood. And were we madder, we might scream instead, of course, whether in joy or terror. As Theseus says in Shakespeare’s A Midsummer Night’s Dream:

Lovers and madmen have such seething brains,
Such shaping fantasies, that apprehend
More than cool reason ever comprehends.
The lunatic, the lover, and the poet
Are of imagination all compact.

*

French existentialist Albert Camus famously made the claim: “There is but one truly serious philosophical problem and that is suicide.” 25 Camus was not an advocate of suicide, of course; far from it. In fact, he saw it as a perfectly vain attempt to flee from the inescapable absurdity of life, something he believed we ought to embrace in order to live authentically. Indeed, Camus regarded every attempt to deny the ultimate meaninglessness of life in a universe that is indifferent to our suffering as a surrogate form of psychological suicide.

But rather than staring blankly into the abyss, Camus urges us to rebel against it: to face its absurdity without flinching and, through rebellion, by virtue of which we individually reconstruct the meaning of our lives afresh, paradoxically to arrive at an extreme rationality. Although perhaps he goes too far, and reaches a point so extreme that few can follow: such a Sisyphean outlook being too desolate for most of us, and his exhortation to authenticity so impassioned that it seems almost infinitely taxing. 26 Kierkegaard’s “leap of faith” is arguably more forgiving of the human condition – but enough of philosophy. 27

This pause is meant for introspection: an opportunity to reconsider how my interlude set out, not only by telling a joke – and hopefully one that made you smile if not laugh out loud – but also by reflecting upon the beautiful wisdom encapsulated in Chuang Tzu’s dream of becoming a butterfly; mystical enlightenment from 4th century BC China that clashes intentionally with the plain silliness of a doctor-doctor joke about a moth-man; a surreal quip about clinical diagnosis and psychiatry (something I shall come to consider next).

However, the running theme here is one of transformation, and at the risk of also killing Chuang Tzu’s message by dissection, I will simply add (unnecessarily from the Daoist perspective) that existence does appear to be cyclically transformative; on personal, collective and altogether cosmic levels, the conscious and unconscious, spiralling outwards – whether upward into light or downward into darkness – each perpetually giving rise to the other just like the everblooming of yang and yin. As maverick clinical psychiatrist R. D. Laing once wrote:

“Most people most of the time experience themselves and others in one way or another that I… call egoic. That is, centrally or peripherally, they experience the world and themselves in terms of a consistent identity, a me-here over against you-there, within a framework of certain ground structures of space and time shared with other members of their society… All religious and all existential philosophies have agreed that such egoic experience is a preliminary illusion, a veil, a film of maya—a dream to Heraclitus, and to Lao Tzu, the fundamental illusion of all Buddhism, a state of sleep, of death, of socially accepted madness, a womb state to which one has to die, from which one has to be born.” 28

Returning from the shadowlands of alienation to contemplate the glinting iridescent radiance of Chuang Tzu’s butterfly’s wings is an invitation to scrape away the dross of habituated semi-consciousness that veils the playful mystery of our minds. On a different occasion, Chuang Tzu wrote:

One who dreams of drinking wine may in the morning weep; one who dreams weeping may in the morning go out to hunt. During our dreams we do not know we are dreaming. We may even dream of interpreting a dream. Only on waking do we know it was a dream. Only after the great awakening will we realize that this is the great dream. And yet fools think they are awake, presuming to know that they are rulers or herdsmen. How dense! You and Confucius are both dreaming, and I who say you are a dream am also a dream. Such is my tale. It will probably be called preposterous, but after ten thousand generations there may be a great sage who will be able to explain it, a trivial interval equivalent to the passage from morning to night. 29

Thus the world about us is scarcely less a construct of our imagination than our dreams are, deconstructed by the senses then seamlessly reconstructed in its entirety. And not just reconfigured via inputs from the celebrated five gateways of vision, sound, touch, taste and smell, but all portals including those of memory, intuition, and even reason. After all, it is curious how we speak of having ‘a sense’ of reason, just as we do ‘a sense’ of humour. Well, do we have… a sense of reason and a sense of humour? If you have followed this far then I sense you may share my own.

Next chapter…

*

Richard Rohr is a Franciscan priest, author and teacher, who says that his calling has been “to retrieve and re-teach the wisdom that has been lost, ignored or misunderstood in the Judeo-Christian tradition.” Rohr is the founder of the Center for Action and Contemplation and academic dean of the CAC’s Living School, where he practises incarnational mysticism, non-dual consciousness, and contemplation, with a particular emphasis on how these affect the social justice issues of our time. Recently he shared his inspirational standpoint in an hour-long chat with ‘Buddha at the Gas Pump’s Rick Archer:

*

Addendum: anyone with half a brain

“The intuitive mind is a sacred gift and the rational mind is a faithful servant. We have created a society that honors the servant and has forgotten the gift.”

— attributed to Albert Einstein 30

*

The development of split-brain operations for the treatment of severe cases of epilepsy, which involve the severing of the corpus callosum, the thick web of nerves that allows communication between the two hemispheres, first drew attention to how the left and right hemispheres have quite different attributes. Unfortunately, the early studies in this field produced superficial and therefore erroneous notions about left and right brain functions, which were in turn vulgarised and popularised as they percolated down into pop psychology and management theory. The left brain was said to generate language and logic, while only the right brain supposedly dealt with feelings and was the creative centre. In reality, both hemispheres are involved in all aspects of cognition, and as a consequence the study of what is technically called the lateralisation of brain function fell to some extent into academic disrepute.

In fact, important differences do occur between the specialisms of the left and right hemispheres, although as psychiatrist Iain McGilchrist proposes in his book The Master and His Emissary (which he sees as the proper roles of the right and left hemispheres respectively) 31, it is often better to understand the distinctions in terms of where conscious awareness is placed. In summary, the left hemisphere attends to and focuses narrowly but precisely on what is immediately in front of you, allowing you to strike the nail with the hammer, thread the eye of the needle, or sort the wheat from the chaff (whatever activity you might be actively engaged with), while the right hemisphere remains highly vigilant and attentive to the surroundings. Thus, the left brain operates tools and usefully sizes up situations, while the right brain’s immediate relationship to the environment and to our bodies makes it the mediator of social activities and of a far broader conscious awareness. However, according to McGilchrist, the left brain is also convinced of its primacy, whereas the right is incapable of comprehending such hierarchies. This is arguably the root of a problem we all face, since it repeatedly leads humans to construct societal arrangements and norms in accordance with left brain dominance, to the inevitable detriment of less restricted right brain awareness.

Supported by many decades of research, this has become the informed view of McGilchrist, and granted that his overarching thesis has merit – note that the basic distinctions between left and right brain awareness are uncontroversial and well understood in psychology, whereas the socio-historical repercussions he infers are more speculative – it raises brain function lateralisation as a major underlying issue that needs to be incorporated in any final appraisal of ‘human nature’, the implications of which McGilchrist propounds at length in his own writing. In the preface to the new expanded edition of The Master and His Emissary (first published in 2009), he writes:

I don’t want it to be possible, after reading this book, for any intelligent person ever again to see the right hemisphere as the ‘minor’ hemisphere, as it used to be called – still worse the flighty, impetuous, fantastical one, the unreliable but perhaps fluffy and cuddly one – and the left hemisphere as the solid, dependable, down-to-earth hemisphere, the one that does all the heavy lifting and is alone the intelligent source of our understanding. I might still be to some extent swimming against the current, but there are signs that the current may be changing direction.

*

Embedded below is a lecture given to the Royal Society of Arts (RSA) in 2010, in which McGilchrist offers a concise overview of how, according to our current understanding, the ‘divided brain’ has profoundly altered human behaviour, culture and society:

To hear these ideas contextualised within an evolutionary account of brain laterality, I also recommend a lecture given to The Evolutionary Psychiatry Special Interest Group of the Royal College of Psychiatrists in London (EPSIG UK) in 2018:

For more from Iain McGilchrist I also recommend this extended interview with physicist and filmmaker Curt Jaimungal, host of Theories of Everything, which premiered on March 29th:

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.

*

1  “insensible perceptions are as important to [the science of minds, souls, and soul-like substances] as insensible corpuscles are to natural science, and it is just as unreasonable to reject the one as the other on the pretext that they are beyond the reach of our senses.” From the Preface of New Essays concerning Human Understanding by Gottfried Leibniz, first published in 1704, translation courtesy of the Stanford Encyclopedia of Philosophy.

2 From Anthropology from a Pragmatic Point of View by Immanuel Kant, first published in 1798.

3 “The definition of Psychology may be best given… as the description and explanation of states of consciousness as such. By states of consciousness are meant such things as sensations, desires, emotions, cognitions, reasonings, decisions, volitions, and the like. Their ‘explanation’ must of course include the study of their causes, conditions, and immediate consequences, so far as these can be ascertained.” from opening paragraph of “Introduction: Body and Mind” from The Principles of Psychology, by William James, first published in 1892.

4 Extract taken from The Varieties of Religious Experience, from chapter on “The Sick Soul”.

5 Letter to his friend, Francis Child.

6 According to James, the first division of “the self” that can be discriminated is between “the self as known”, the me, and “the self as knower”, the I, or “pure ego”. The me he then suggests might be sub-divided in a constituent hierarchy: “the material me” at the lowest level, then “the social me” and top-most “the spiritual me”. It was not until very much later, in the 1920s, that Freud fully developed his own tripartite division of the psyche into id, ego and super-ego, a division that surely owes much to James.

7

In the spring of 1876, a young man of nineteen arrived in the seaside city of Trieste and set about a curious task. Every morning, as the fishermen brought in their catch, he went to meet them at the port, where he bought eels by the dozens and then the hundreds. He carried them home, to a dissection table in a corner of his room, and—from eight until noon, when he broke for lunch, and then again from one until six, when he quit for the day and went to ogle the women of Trieste on the street—he diligently slashed away, in search of gonads.

“My hands are stained by the white and red blood of the sea creatures,” he wrote to a friend. “All I see when I close my eyes is the shimmering dead tissue, which haunts my dreams, and all I can think about are the big questions, the ones that go hand in hand with testicles and ovaries—the universal, pivotal questions.”

The young man, whose name was Sigmund Freud, eventually followed his evolving questions in other directions. But in Trieste, elbow-deep in slime, he hoped to be the first person to find what men of science had been seeking for thousands of years: the testicles of an eel. To see them would be to begin to solve a profound mystery, one that had stumped Aristotle and countless successors throughout the history of natural science: Where do eels come from?

From an article entitled “Where Do Eels Come From?” written by Brooke Jarvis, published in New Yorker magazine on May 18, 2020. https://www.newyorker.com/magazine/2020/05/25/where-do-eels-come-from

8 In the BBC TV sci-fi comedy Red Dwarf (Series 1, episode “Confidence and Paranoia”), the eponymous characters form an alternative superego-id partnership, existing as physical manifestations which appear on board as symptoms of Lister’s illness.

9 Fixing on specific erogenous zones of the body, Freud believed that libidinous desire shaped our psychological development in a very specific fashion, naturally progressing, if permitted, through early stages from oral, to anal, and, then reaching adulthood, to genital.

10 Jocasta, the queen of Thebes, is barren, and so she and her husband, the king Laius, decide to consult the Oracle of Delphi. The Oracle tells them that if Jocasta bears a son, then the son will kill his father and marry her. Later, when Jocasta does indeed have a son, Laius demands that a servant take the baby to a mountain to be abandoned, his ankles pinned together just in case. But Oracles are rarely mistaken, fate is hard to avoid, and so as it happens the servant spares the infant, giving him to a shepherd instead. Eventually, as fortune will have it, the infant is adopted by the king and queen of Corinth, and named Oedipus because of the swellings on his feet. Years pass. Then, one day Oedipus learns that the king and queen are not his parents, but when he asks them, they deny the truth. So Oedipus decides to put the question to the Oracle of Delphi instead, who, being an enigmatic type, refuses to identify his true parents, but foretells his future instead, saying that he is destined to kill his father and marry his mother. Determined to avoid this fate, Oedipus resolves not to return home to Corinth, heading to, you guessed it, Thebes instead. He comes to an intersection of three roads and meets Laius driving a chariot. They argue about who has the right of way and then, in an early example of road rage, their rage spills into a fight and thus Oedipus unwittingly kills his real father. Next up, he meets the sphinx, who asks its famous riddle. This is a question of life and death, all who have incorrectly answered having been killed and eaten, but Oedipus gets the answer right and so obligingly the sphinx kills itself instead. Having freed the people of Thebes from the sphinx, Oedipus receives the hand of the recently widowed Jocasta in marriage. All is well for a while, but then it comes to pass that Jocasta learns who Oedipus really is, and hangs herself. Then, later again, Oedipus discovers that he was the murderer of his own father, and gouges his own eyes out.

11 Sigmund Freud, The Interpretation of Dreams, chapter V, “The Material and Sources of Dreams”

12 From an essay by C.G. Jung published in CW XI, Para 520. The word ‘Raca’ is an insult translated as ‘worthless’ or ‘empty’ taken from a passage in the Sermon on the Mount from Matthew 5:22.

13 Jung described the shadow in a key passage as “that hidden, repressed, for the most part inferior and guilt-laden personality whose ultimate ramifications reach back into the realm of our animal ancestors… If it has been believed hitherto that the human shadow was the source of evil, it can now be ascertained on closer investigation that the unconscious man, that is his shadow, does not consist only of morally reprehensible tendencies, but also displays a number of good qualities, such as normal instincts, appropriate reactions, realistic insights, creative impulses etc.”

From Jung’s Collected Works, 9, part 2, paragraph 422–3.

14 In response to a question in an interview conducted by John Freeman just two years before Jung’s death and broadcast as part of the BBC Face to Face TV series in 1959. Having asked about Jung’s childhood and whether he had been made to attend church, Freeman then asked: “Do you now believe in God?” Jung replied: “Now? Difficult to answer… I know. I don’t need to believe, I know.”

15 The quote in full reads: “This is the only story of mine whose moral I know. I don’t think it’s a marvelous moral, I just happen to know what it is: We are what we pretend to be, so we must be careful about what we pretend to be.” From Mother Night (1962) by Kurt Vonnegut.

16 The Human Situation is a collection of lectures first delivered by Aldous Huxley at the University of California in 1959. These were edited by Piero Ferrucci and first published in 1978 by Chatto & Windus, London. Both extracts here were taken from his lecture on “Language”, p 172.

17 This is the premise behind Orwell’s ‘Newspeak’ used in his dystopian novel Nineteen Eighty-Four. In Chapter 5, Syme, a language specialist and one of Winston Smith’s colleagues at the Ministry of Truth, explains enthusiastically to Winston:

“Don’t you see that the whole aim of Newspeak is to narrow the range of thought? In the end we shall make thoughtcrime literally impossible, because there will be no words in which to express it. Every concept that can ever be needed, will be expressed by exactly one word, with its meaning rigidly defined and all its subsidiary meanings rubbed out and forgotten.”

18 I should note that the idea proposed here is not altogether new, and that the concept of ‘linguistic relativity’ is jointly credited to linguists Edward Sapir and Benjamin Whorf, who, whilst working independently, came to the parallel conclusion that (in the strong form) language determines thought or (in the weak form) language and its usage influences thought. Whorf also inadvertently created the urban myth that Eskimos have a hundred words for snow after he wrote in a popular article: “We [English speakers] have the same word for falling snow, snow on the ground, snow hard packed like ice, slushy snow, wind-driven snow – whatever the situation may be. To an Eskimo, this all-inclusive word would be almost unthinkable…” The so-called “Sapir-Whorf hypothesis” continues to inspire research in psychology, anthropology and philosophy.

19 After writing this, I then read Richard Dawkins’ The Ancestor’s Tale. Aside from being a most wonderful account of what Dawkins poetically describes as his ‘pilgrimage to the dawn of life’, here Dawkins also returns to many themes of his earlier books, occasionally moderating or further elucidating previous thoughts and ideas. In the chapter entitled ‘The Peacock’s Tale’ [pp 278–280], he returns to speculate more about the role memes may have had in human development. In doing so he presents an idea put forward by his friend, the philosopher Daniel Dennett, in his book Consciousness Explained, which is that local variation of memes is inevitable:

“The haven all memes depend on reaching is the human mind, but the human mind is itself an artifact created when memes restructure a human brain in order to make it a better habitat for memes. The avenues for entry and departure are modified to suit local conditions, and strengthened by various artificial devices that enhance fidelity and prolixity of replication: native Chinese minds differ dramatically from native French minds, and literate minds differ from illiterate minds.” And is it not also implicit here that the unconscious brain will likewise be differently ‘restructured’ by different environmental influences?

20 Barack Obama, whose own election was acclaimed by some and witnessed by many as proof of the American Dream, recently compared E pluribus unum to the Indonesian motto Bhinneka Tunggal Ika — unity in diversity.

“But I believe that the history of both America and Indonesia should give us hope. It is a story written into our national mottos. In the United States, our motto is E pluribus unum — out of many, one. Bhinneka Tunggal Ika — unity in diversity. (Applause.) We are two nations, which have traveled different paths. Yet our nations show that hundreds of millions who hold different beliefs can be united in freedom under one flag.” Press release (unedited) from The White House, posted November 10th, 2010: “Remarks by the President at the University of Indonesia in Jakarta, Indonesia”

21 Summary of statistical analysis by the Center for American Progress, “Understanding Mobility in America”, by Tom Hertz, American University, published April 26th, 2006. Amongst the key findings was a discovery that “Children from low-income families have only a 1 percent chance of reaching the top 5 percent of the income distribution, versus children of the rich who have about a 22 percent chance [of remaining rich].” and that “By international standards, the United States has an unusually low level of intergenerational mobility: our parents’ income is highly predictive of our income as adults.” The report adds that “Intergenerational mobility in the United States is lower than in France, Germany, Sweden, Canada, Finland, Norway and Denmark. Among high-income countries for which comparable estimates are available, only the United Kingdom had a lower rate of mobility than the United States.”

Reproduced from an article entitled “Advertising vs. Democracy: An Interview with Jean Kilbourne” written by Hugh Iglarsh, published in Counterpunch magazine on October 23rd 2020. https://www.counterpunch.org/2020/10/23/advertising-vs-democracy-an-interview-with-jean-kilbourne/ 

22 In this follow-up to his more famous On the Origin of Species (1859) and The Descent of Man (1871), Charles Darwin reported in Chapter VI, entitled “Special Expressions of Man: Suffering and Weeping”, of the third of his major works, The Expression of the Emotions in Man and Animals (1872), that:

I was anxious to ascertain whether there existed in any of the lower animals a similar relation between the contraction of the orbicular muscles during violent expiration and the secretion of tears; but there are very few animals which contract these muscles in a prolonged manner, or which shed tears. The Macacus maurus, which formerly wept so copiously in the Zoological Gardens, would have been a fine case for observation; but the two monkeys now there, and which are believed to belong to the same species, do not weep. Nevertheless they were carefully observed by Mr. Bartlett and myself, whilst screaming loudly, and they seemed to contract these muscles; but they moved about their cages so rapidly, that it was difficult to observe with certainty. No other monkey, as far as I have been able to ascertain, contracts its orbicular muscles whilst screaming.

The Indian elephant is known sometimes to weep. Sir E. Tennent, in describing these which he saw captured and bound in Ceylon, says, some “lay motionless on the ground, with no other indication of suffering than the tears which suffused their eyes and flowed incessantly.” Speaking of another elephant he says, “When overpowered and made fast, his grief was most affecting; his violence sank to utter prostration, and he lay on the ground, uttering choking cries, with tears trickling down his cheeks.” In the Zoological Gardens the keeper of the Indian elephants positively asserts that he has several times seen tears rolling down the face of the old female, when distressed by the removal of the young one. Hence I was extremely anxious to ascertain, as an extension of the relation between the contraction of the orbicular muscles and the shedding of tears in man, whether elephants when screaming or trumpeting loudly contract these muscles. At Mr. Bartlett’s desire the keeper ordered the old and the young elephant to trumpet; and we repeatedly saw in both animals that, just as the trumpeting began, the orbicular muscles, especially the lower ones, were distinctly contracted. On a subsequent occasion the keeper made the old elephant trumpet much more loudly, and invariably both the upper and lower orbicular muscles were strongly contracted, and now in an equal degree. It is a singular fact that the African elephant, which, however, is so different from the Indian species that it is placed by some naturalists in a distinct sub-genus, when made on two occasions to trumpet loudly, exhibited no trace of the contraction of the orbicular muscles.

The full text is uploaded here: https://www.gutenberg.org/files/1227/1227-h/1227-h.htm#link2HCH0006

23 Quote from The Expression of the Emotions in Man and Animals (1872), Chapter VIII “Joy, High Spirits, Love, Tender Feelings, Devotion” by Charles Darwin. He continues:

The circumstances must not be of a momentous nature: no poor man would laugh or smile on suddenly hearing that a large fortune had been bequeathed to him. If the mind is strongly excited by pleasurable feelings, and any little unexpected event or thought occurs, then, as Mr. Herbert Spencer remarks, “a large amount of nervous energy, instead of being allowed to expend itself in producing an equivalent amount of the new thoughts and emotion which were nascent, is suddenly checked in its flow.” . . . “The excess must discharge itself in some other direction, and there results an efflux through the motor nerves to various classes of the muscles, producing the half-convulsive actions we term laughter.” An observation, bearing on this point, was made by a correspondent during the recent siege of Paris, namely, that the German soldiers, after strong excitement from exposure to extreme danger, were particularly apt to burst out into loud laughter at the smallest joke. So again when young children are just beginning to cry, an unexpected event will sometimes suddenly turn their crying into laughter, which apparently serves equally well to expend their superfluous nervous energy.

The imagination is sometimes said to be tickled by a ludicrous idea; and this so-called tickling of the mind is curiously analogous with that of the body. Every one knows how immoderately children laugh, and how their whole bodies are convulsed when they are tickled. The anthropoid apes, as we have seen, likewise utter a reiterated sound, corresponding with our laughter, when they are tickled, especially under the armpits… Yet laughter from a ludicrous idea, though involuntary, cannot be called a strictly reflex action. In this case, and in that of laughter from being tickled, the mind must be in a pleasurable condition; a young child, if tickled by a strange man, would scream from fear…. From the fact that a child can hardly tickle itself, or in a much less degree than when tickled by another  person, it seems that the precise point to be touched must not be known; so with the mind, something unexpected – a novel or incongruous idea which breaks through an habitual train of thought – appears to be a strong element in the ludicrous.

24 Quote from Leviathan (1651), The First Part, Chapter 6, by Thomas Hobbes (with italics and spelling as in the original). Hobbes continues:

And it is incident most to them, that are conscious of the fewest abilities in themselves; who are forced to keep themselves in their own favour, by observing the imperfections of other men. And therefore much Laughter at the defects of others is a signe of Pusillanimity. For of great minds, one of the proper workes is, to help and free others from scorn; and compare themselves onely with the most able.

Interestingly, Hobbes then immediately offers his account of weeping as follows:

On the contrary, Sudden Dejection is the passion that causeth WEEPING; and is caused by such accidents, as suddenly take away some vehement hope, or some prop of their power: and they are most subject to it, that rely principally on helps externall, such as are Women, and Children. Therefore, some Weep for the loss of Friends; Others for their unkindnesse; others for the sudden stop made to their thoughts of revenge, by Reconciliation. But in all cases, both Laughter and Weeping, are sudden motions; Custome taking them both away. For no man Laughs at old jests; or Weeps for an old calamity.

https://www.gutenberg.org/files/3207/3207-h/3207-h.htm#link2H_PART1

25 “Il n’y a qu’un problème philosophique vraiment sérieux : c’est le suicide.” (“There is but one truly serious philosophical problem, and that is suicide.”) Quote taken from The Myth of Sisyphus (1942) by Albert Camus, translated by Justin O’Brien.

26 In Greek mythology Sisyphus was punished in hell by being forced to roll a huge boulder up a hill only for it to roll down every time, repeating his action for eternity. In his philosophical essay The Myth of Sisyphus (1942) Camus compares this unremitting and unrewarding task of Sisyphus to the lives of ordinary people in the modern world, writing:

“The workman of today works every day in his life at the same tasks, and this fate is no less absurd. But it is tragic only at the rare moments when it becomes conscious.”

In sympathy he also muses on Sisyphus’ thoughts especially as he trudges in despair back down the mountain to collect the rock again. He writes:

“You have already grasped that Sisyphus is the absurd hero. He is, as much through his passions as through his torture. His scorn of the gods, his hatred of death, and his passion for life won him that unspeakable penalty in which the whole being is exerted toward accomplishing nothing. This is the price that must be paid for the passions of this earth. Nothing is told us about Sisyphus in the underworld. Myths are made for the imagination to breathe life into them.”

Continuing:

“It is during that return, that pause, that Sisyphus interests me. A face that toils so close to stones is already stone itself! I see that man going back down with a heavy yet measured step toward the torment of which he will never know the end. That hour like a breathing-space which returns as surely as his suffering, that is the hour of consciousness. At each of those moments when he leaves the heights and gradually sinks toward the lairs of the gods, he is superior to his fate. He is stronger than his rock.

“If this myth is tragic, that is because its hero is conscious. Where would his torture be, indeed, if at every step the hope of succeeding upheld him? The workman of today works every day in his life at the same tasks, and his fate is no less absurd. But it is tragic only at the rare moments when it becomes conscious. Sisyphus, proletarian of the gods, powerless and rebellious, knows the whole extent of his wretched condition: it is what he thinks of during his descent. The lucidity that was to constitute his torture at the same time crowns his victory. There is no fate that cannot be surmounted by scorn.”

You can read the extended passage here: http://dbanach.com/sisyphus.htm

27 Søren Kierkegaard never actually coined the term “leap of faith” although he did use the more general notion of “leap” to describe situations whenever a person is faced with a choice that cannot be fully justified rationally. Moreover, in this instance the “leap” is perhaps better described as a leap “towards” or “into” faith that finally overcomes what Kierkegaard saw as an inherent paradoxical contradiction between the ethical and the religious. However, Kierkegaard never advocates “blind faith”, but instead recognises that faith ultimately calls for action in the face of absurdity.

In Part Two, “The Subjective Issue”, of his 1846 work and impassioned attack against Hegelianism, Concluding Unscientific Postscript to the Philosophical Fragments (Danish: Afsluttende uvidenskabelig Efterskrift til de philosophiske Smuler), which is known for its dictum, “Subjectivity is Truth”, Kierkegaard wrote:

“When someone is to leap he must certainly do it alone and also be alone in properly understanding that it is an impossibility… the leap is the decision… I am charging the individual in question with not willing to stop the infinity of [self-]reflection. Am I requiring something of him, then? But on the other hand, in a genuinely speculative way, I assume that reflection stops of its own accord. Why, then, do I require something of him? And what do I require of him? I require a resolution.”

28 R. D. Laing, The Politics of Experience  (Ballantine Books, N.Y., 1967)

29 Quoted from the book known as Zhuangzi (also transliterated as Chuang Tzu or Chuang Chou). Translation by Lin Yutang

30 Although in all likelihood a reworking of a passage from a book entitled The Metaphoric Mind: A Celebration of Creative Consciousness, written by Bob Samples and published in 1976, in which the fuller passage reads [with emphasis added]:

“The metaphoric mind is a maverick. It is as wild and unruly as a child. It follows us doggedly and plagues us with its presence as we wander the contrived corridors of rationality. It is a metaphoric link with the unknown called religion that causes us to build cathedrals — and the very cathedrals are built with rational, logical plans. When some personal crisis or the bewildering chaos of everyday life closes in on us, we often rush to worship the rationally-planned cathedral and ignore the religion. Albert Einstein called the intuitive or metaphoric mind a sacred gift. He added that the rational mind was a faithful servant. It is paradoxical that in the context of modern life we have begun to worship the servant and defile the divine.”

31 The book is subtitled The Divided Brain and the Making of the Western World


the price of everything

The following article is Chapter Nine of a book entitled Finishing The Rat Race which I am posting chapter by chapter throughout this year. Since blog posts are stacked in a reverse time sequence (always with the latest at the top), I have decided that the best approach is to post the chapters in reverse order.

All previously uploaded chapters are available (in sequence) by following the link above or from category link in the main menu, where you will also find a brief introductory article about the book itself and why I started writing it.

*

“When the accumulation of wealth is no longer of high social importance, there will be great changes in the code of morals. We shall be able to rid ourselves of many of the pseudo-moral principles which have hag-ridden us for two hundred years, by which we have exalted some of the most distasteful of human qualities into the position of the highest virtues. We shall be able to afford to dare to assess the money-motive at its true value. The love of money as a possession — as distinguished from the love of money as a means to the enjoyments and realities of life — will be recognised for what it is, a somewhat disgusting morbidity, one of those semi-criminal, semi-pathological propensities which one hands over with a shudder to the specialists in mental disease…”

John Maynard Keynes 1

*

Have you ever wondered what it’s like to be rich? Here I don’t just mean well-off, with a paltry few tens of millions in the bank, I mean proper rich – megabucks! So much money that, as I heard one comedian put it (aiming his joke squarely at the world’s richest entrepreneur), if Bill Gates were to stuff all his cash under the mattress, then due to interest alone, if he fell out of bed he’d never hit the ground!

I suppose what I’m wondering is this – and perhaps you’ve found yourself thinking along similar lines – why are these super-rich guys always so intent on accruing ever greater wealth when they already possess more than enough funds to guarantee the needs of a small country? Think about it this way: Gates and the others are, barring a few very necessary legal constraints, completely at liberty to do whatever they choose at every moment of every day. They can eat the best food, drink the most delicious vintage wines, smoke the finest cigars, play golf morning, noon, and evening, and then after the sun goes down, and if it is their wont, have liaisons with the most voluptuous women (or men) available. Quite literally, they have the means to go anywhere and do everything to their heart’s content, and all at a moment’s notice. Just imagine that. So why bother about sales at all? I mean, wouldn’t you eventually get bored of simply accumulating more and more money when you’ve already got so much – and let’s face it, money itself is pretty boring stuff. So just what is it that keeps them all going after it? After all, there are only so many swimming pools, grand pianos, swimming pools in the shape of grand pianos, Aston Martins, Lear Jets, and acreages of real estate that one man (or woman) can profitably use (in the non-profit-making sense, obviously). Economists would call this the law of diminishing marginal utility, although in this instance it is basic common sense.2

Presented with evidence of this kind, some will say that here is further proof of the essential greediness of human beings. That, as a species, we are simply never satisfied until we have the lot. Fine then, let us take on this modern variant of original sin, since it certainly holds more than a grain of truth. For the sake of argument, we might presume that all men and women are greedy to an almost limitless extent; that this is truly the natural order, and that from conception we have been evolutionarily programmed to grab as much as we can for ourselves – our most primeval reflex being to snatch.

So I shall not waste too much time here. Only to say that I do not find such unrestrained cupidity within the circles of people with whom I have chosen to associate, most being happy enough to share out the peanuts and fork out for the next round of beers, quite oblivious to outcomes in terms of commensurate returns. What goes around comes around… There is, of course, no doubting that most folks will, very naturally, if the opportunity arises, take good advantage to feather their own nests: making life a little more comfortable for themselves, and reserving the greater share of their fortune for their immediate family and closest friends. But then, why not…? Charity begins at home, right?

What most don’t do (at least in the circles I know best) is devote their whole lives to the narrow utilitarian project outlined above. And why? Because, though money and property are, quite understandably, greatly prized assets, they offer lesser rewards than companionship and love. And, in any case, pure generosity is its own reward – and I do mean “is”, and not “has” or “brings” – the reward being an inseparable part of the act itself: a something received as it was given, like a hug, like a kiss. That said, if you still prefer to believe that we are all, to a man, woman and child, innately and incurably selfish and greedy, then next time you take a look into the mirror, do consider those all-too-beady eyes staring back. It’s very easy to generalise about mankind when you forget to count yourself in.

But if greed is not intractably a part of human nature, then we must find other reasons to account for how our world is nevertheless so horribly disfigured by rampant and greedy exploitation. For if greed is not an inherently human trait – and here I mean greed with a capital Grrr – then this monomaniacal obsession is all too frequently acquired, especially by those who approach the top of the greasy pole. There is an obvious circularity in this, of course: those whose progress has depended upon making a buck very often become addicted. As money-junkies, they, like other addicts, then prioritise their own fix above all else. Whether or not these types are congenitally predisposed to becoming excessively greedy, we have no way of knowing. What we can be certain of is this: by virtue of having acquired such great wealth, they disproportionately shape the environment they and we live in. So they are not merely money-junkies, but also money-pushers: if you’re not a money-junkie, then you don’t know what you’re missing. There’s nothing new in this. This is the way the world has been for many centuries, and perhaps ever since money was first invented.

So here’s Oscar Wilde addressing the same questions about money and our unhealthy relationship to it; his thoughts leaping more than a century, during which time very little has apparently changed:

“In a community like ours, where property confers immense distinction, social position, honour, respect, titles, and other pleasant things of this kind, man, being naturally ambitious, makes it his aim to accumulate this property, and goes on wearily and tediously accumulating it long after he has got far more than he wants, or can use, or enjoy, or perhaps even know of. Man will kill himself by overwork in order to secure property, and really, considering the enormous advantages that property brings, one is hardly surprised. One’s regret is that society should be constructed on such a basis that man has been forced into a groove in which he cannot freely develop what is wonderful, and fascinating, and delightful in him – in which, in fact, he misses the true pleasure of joy and living.”3

Embedded below is a recent interview [from December 2013] that Pulitzer Prize-winning journalist Chris Hedges gave on “The Real News”, in which he talked about – based to a large extent on his own personal experience – how the super-rich are isolated and disconnected from the rest of society. He explains how this creates a deluded sense of entitlement and a pathological callousness:

*

Isn’t money funny stuff! Funny peculiar, I mean. We just take it so much for granted, almost as though it were a natural substance (disappointingly, of course, it doesn’t actually grow on trees). But when we do think about it, money has far stranger properties than anything in the natural world. And our relationship to it is more peculiar than our relationship to almost anything else.

Money, that’s what I want… sang the Beatles on one of their less celebrated tracks. But the truth will out. So just why did the Beatles want money, and, for that matter, why do I, and why do you? It doesn’t work, you can’t eat it, and it’s not, as a rule, a thing of special beauty. Money is, in fact, absolutely useless right up until you decide to swap it for what you actually want.

Money can’t buy me love, true again, but it might buy me a chocolate bar. Because money is really just a tool, a technology: a highly specialised kind of lubricant that enables people to exchange their goods and services with greater ease and flexibility. The adoption of a money system enables levels of parity for otherwise complex exchanges to be quickly agreed and settled. The great thing about money, to provide a concrete illustration, is that although £1 of tinned herring is probably equivalent to about thirty seconds of emergency plumbing (if you’re lucky), you won’t require crates of herring to pay for the call-out. So far so simple.

Except wait. We all know how the price of herring can go up as well as down, and likewise the price of emergency plumbers. So why such a dynamic relationship? Well, there’s “the market”, a price-fixing system that arises spontaneously, regulating the rates of exchange between goods and services on the basis of supply adjusting to match demand. Thus, by a stroke of good fortune, we find that money is not merely a lubricant for exchange, but also a regulator of useful production and services. This, at least, is the (widely accepted) theory.

Prices rise and fall in accordance with demand. Things that are in short supply become expensive; things that are abundant are cheaper. This is basic economic theory and it means, amongst other things, that in every transaction the “real value” of your money is actually relative, for the simple reason that the amount required depends not only on what you’re after, but also upon whether or not other people are after the same kind of thing. Money, then, in terms of its “real value” to any individual or group, is something that is constantly varying. We might call this “the relativity of money”.

One consequence of the relative nature of money is that the useful value of money overall can also rise and fall. It is possible for wholesale, retail and labour costs all to rise or fall more or less together, although the general tendency, as we all know from experience, is for overall rising costs. Indeed such “inflation” is regarded as normal and expected, and, as a consequence, it comes to seem just as natural as money itself. Yet since you always need more and more money to buy the same things, the value of your money must, in some important way, be constantly falling. But just why does money as a whole lose its value in this way? What makes yesterday’s money worth less than today’s? Well, it turns out that this is a huge question and one that economists have argued long and hard about.

One partial account of inflation goes as follows: businesses and people in business are constantly looking for a little bit more – for how else can they maximise profits? In direct consequence, we, as customers, necessarily require more dosh to pay for the same goods or services. But enlarging our budget automatically requires a commensurate increase in income, which means successfully negotiating for a larger salary. In the bigger picture, then, the businesses supplying our wants and needs now need to cover their larger wage-bills, which means higher prices to compensate. So prices and incomes rise together, with money becoming worth less and less precisely because everyone is trying to accumulate more and more of it. This endless tail-chasing escalation, which is given the fancy title of “the price/wage spiral”, serves as an excellent example of why money is really very odd stuff indeed.

And what is money in any case? The first traders most likely exchanged shells, precious stones, or other baubles to aid in bartering, but then, naturally enough, over time these exchanges would have been formalised, agreements arising with regard to which objects and materials were most acceptable as currency. The material that became most widely accepted was eventually, of course, gold. But why gold? Well, no one actually knows, but we can make some educated guesses.

Firstly, gold is scarce, and it is also rare in other ways – for instance, having a unique and unusual colour, one which just happens to correspond to the colour of the Sun. The fact that it is almost chemically inert and so doesn’t tarnish means that it also shines eternally, and so again is like the Sun. Indeed, Aldous Huxley, in Heaven and Hell (his sequel to The Doors of Perception), points out that almost every substance that humans have ever regarded as valuable shares this property of shininess. To Huxley this is evidence that even money owes its origins, in part at least, to a common spiritual longing: our wish to own a precious piece of paradise.

But back to more mundane matters: if gold (or any other substance) is chosen as your currency, then there arises another problem. How to guarantee the quantity and quality of the gold in circulation? For if gold is worth faking or adulterating, then it’s certain that somebody will try cheating.

Well, one answer could be the adoption of some kind of official seal, a hallmark, and this solution leads, naturally enough, to the earliest forms of coinage. But then, if the coins are difficult to counterfeit, why bother to make them out of gold in the first place? Just the official seal would be enough to ensure authenticity. And why bother with metal, which is bulky and heavy? So again it’s an obvious and logical leap to begin producing paper banknotes. These coins and banknotes, although far less intrinsically valuable in material terms than the gold they represent, are still backed by the promise that they are redeemable into gold. But hang on, what’s so special about the gold anyway (aside from its shininess)? And doesn’t the gold, which is now locked up in bullion reserves, in fact have real uses of its own? And doesn’t this mean that the gold also has a monetary value? So why not cut loose from the circularity and admit that the value of money can exist entirely independently of the gold or of any other common standard? Indeed, why couldn’t the issuing authority, which might be a government but is more often a central bank, simply make up a “legal tender”4 with no intrinsic or directly correlated value whatsoever and issue that? Not that the money issued need even correspond to the amount of real coins or paper banknotes in circulation – most of the world’s money being bits and bytes, ones and zeroes, orbiting out in cyber-space. Which brings us to just how funny money has now become.

The Pound Sterling, the various dollars, the Euro and every major currency on Earth are, to apply the correct terminology, “fiat currencies”.5 With fiat currencies there is no parity to the value of any other commodities and so they are, if you like, new forms of gold. As such, and given their shifting relative values, these new fiat currencies can also be traded as another kind of commodity. Money, in the form of currency, becomes an investment in itself. Money is strange stuff indeed.

Yet money also remains an instrument. And we use this instrument to measure just about everything. To establish the value of raw materials and manufactured items. The value of land and, by extension, the value of the space it occupies. The value of labour, and thus a value on the time used. And, since works of art are also bought and sold, money is even applied as a measure of such absolutely intangible qualities as beauty.

So money is basically a universally adaptable gauge, and this is its great strength. It is perhaps the big reason why its invention gradually caught on in such a fundamental way. From humble trading token, money has risen to become a primary measure of all things. But remember, remember… Money, whether fiat currency or gold standard, can never be real in the same way as tins of herring and plumbers are real, and neither is “monetary value” an absolute and intrinsic property, but only ever relative and acquired. Money, we ought to constantly remind ourselves (since we clearly need reminding) is nothing without us or without our highly structured civilisation – intrinsically, it is worthless. It is very strange stuff.

Perhaps the future benchmark for money will no longer be gold but ‘virtual gold’ in the form of cryptocurrencies – bitcoin being currently the most well-known of these. One advocate of these alternatives to traditional forms of money is financial expert Max Keiser. On February 3rd 2014, he spoke with coder, hacker and cryptocurrency specialist Andreas Antonopoulos about the regulation of bitcoin transactions; the advent of bitcoin derivatives, which he believes are less of a threat than ordinary derivatives (a subject I’m coming to next); the fact that unlike gold, cryptocurrencies can be ‘teleported’; and a future in which bitcoin is used widely by businesses as much as by individuals. He says that a time is coming when the prevalent misgivings and doubts about bitcoin and other cryptos will have long since been forgotten. Is he right? I don’t know and remain highly skeptical, but I find the debate an interesting one.

Incidentally, there are less radical and more tangible alternatives to the currencies we now have in circulation. “Treasury notes” are one such alternative and these have historical precedents in the form of both the American “greenback” and the UK’s Bradbury Pound. To read more about this and also for links to campaigns to reintroduce them please read the addendum at the end of the chapter.

*

Little more than a century ago, and even in the richest corners of the world, there were no dependable mechanisms to safeguard against the vicissitudes of fortune. If you weren’t already poor and hungry (as most were), then you could rest assured that potential poverty and hunger were waiting just around the corner. Anyone with aspirations to scale the ladder to secure prosperity faced the almost insurmountable barriers of class and (a generally corresponding) lack of education. A lower class person of such ambitions would be very well aware that if they could step onto the ladder at all, there was very little in the way of protection to save them in the event of falling; errors of judgement or sheer misfortune resulting in almost certain and unmitigated personal disaster. This was the sorry situation for people at all levels of society aside from the highest echelons.

One tremendous advantage then, of living in a modern society, is that, aside from having slightly less restricted social mobility (not that we now live in the classless society we are told to believe in), there are basic safety nets in place, with additional protection that is optionally available. For those languishing at the bottom of the heap, there are the reliable though meagre alms provided through a welfare system, whilst for the ever-expanding middle classes there is plenty of extra cover in the form of saving schemes, pension schemes, and, in the event of the most capricious and/or calamitous of misfortunes, the ever-expanding option of insurance policies. If The Merchant of Venice were set in today’s world, the audience would feel little sympathy for Antonio’s predicament. Why had he ventured on such a risk in the first place, casting his fortune adrift on dangerous waters? Why hadn’t he protected his assets by seeking independent financial advice and taking out some preferential cover? It’s a duller story altogether.

Systems for insurance are essential in any progressive civilisation. Protection against theft, against damage caused by floods, fires and other agents of destruction, and against loss of life and earnings. Having insurance means that we can all relax a bit, quite a lot, in fact. But it also means that, alongside the usual commodities, there’s another less tangible factor to be costed and valued. That risk itself needs to be given a price, and that necessarily means speculating about the future.

Indeed, speculations about the future have come very much to the forefront of financial trading. As a consequence of this, at least in part, today’s financial traders have become accustomed to dealing in “commodities” that have no intrinsic use or value whatsoever. They might, for example, exchange government bonds for promises of debt repayment. Or, feeling a little more adventurous, they might speculate on the basis of future rates of foreign exchange, or in interest rates, or share prices, or rates of inflation, or in a multitude of other kinds of “underlying assets” (including that most changeable of underlying variables: the weather) by exchange of promissory notes known most commonly as “derivatives”, since they derive their value entirely on the basis of the future value of something else. And derivatives can be “structured” in a myriad of ways. Here are just a few you may have heard of:

  • futures (or forwards) are contracts to buy or sell the “underlying asset” at a specified future date, at a price agreed today.
  • options allow the holder the right, without obligation (hence “option”), to buy (a “call option”) or to sell (a “put option”) the “underlying asset” at an agreed price.
  • swaps are contracts agreeing to exchange money up until a specified future date, based on the underlying value of exchange rates, interest rates, commodity prices, stocks, bonds, etc.
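The differences between these paper promises come down to their payoffs at expiry, which can be sketched in a few lines. The numbers below are invented purely for illustration (an asset contracted or “struck” at 100):

```python
# Illustrative payoffs at expiry, using an invented strike price of 100.
strike = 100.0

def forward_payoff(spot):
    # A forward obliges you to buy at the strike, gain or lose.
    return spot - strike

def call_payoff(spot):
    # A call gives the right, not the obligation, to buy at the strike.
    return max(spot - strike, 0.0)

def put_payoff(spot):
    # A put gives the right, not the obligation, to sell at the strike.
    return max(strike - spot, 0.0)

for spot in (80.0, 100.0, 120.0):
    print(spot, forward_payoff(spot), call_payoff(spot), put_payoff(spot))
```

Notice that the forward holder loses money when the price falls, whereas the option holder simply declines to exercise — which is exactly why the right-without-obligation earns its name.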

You name it: there are now paper promises for paper promises of every conceivable kind. Now the thing is that because you don’t need to own the “underlying asset” itself, there is no limit to the amounts of these paper promises that can be traded. Not that this is as novel as it may first appear.

Anyone who’s ever bought a lottery ticket has in effect speculated on a derivative, its value in this case being entirely dependent upon the random motion of coloured balls in a large transparent tumbler at an allocated future time. All betting works this way, and so all bets are familiar forms of derivatives. And then there are, if you like, negative bets. Bets you’d rather lose. For instance, “£200 says my house will burn down this year” is presumably a bet you’d rather lose, but it is still a bet that many of us annually make with an insurance company. And general insurance policies are indeed another form of familiar derivative – they are in effect “put options”.

However there is one extremely important difference here between an ordinary insurance policy and a “put option” – in the case of the “put option”, you don’t actually need to own the “underlying asset”, which means, to draw an obvious comparison, you might take out house insurance on your neighbour’s property rather than your own. And if their house burns down, ahem, accidentally of course, then good for you. Cash in your paper promise and buy a few more – who knows, perhaps your neighbour is also a terrible driver. There are almost numberless opportunities for insuring other people’s assets, and with only the law preventing you, why not change the law? Which is exactly what has happened, with some kinds of derivatives circumventing the law in precisely this way, and permitting profitable speculation on the basis of third party failures. When it comes to derivatives then, someone can always be making a profit come rain or shine, come boom or total financial meltdown.

But, why stop there? Especially when the next step is so obvious that it almost seems inevitable. Yes, why not trade in speculations on the future value of the derivatives themselves? After all, treating the derivative itself as an “underlying asset” opens the way for multiple higher order derivatives, creating with it, the opportunity for still more financial “products” to be traded. Sure, these “exotic financial instruments” quickly become so complex and convoluted that you literally need a degree in mathematics in order to begin to decipher them. Indeed those on the inside make use of what are called “the Greeks”, and “the Higher Order Greeks”, since valuation requires the application of complex mathematical formulas comprised of strings of Greek letters, the traders here fully aware, of course, that it’s all Greek to the rest of us. Never mind – ever more financial “products” means ever more trade, and that’s to the benefit of all, right…?

Deregulation of the markets – kicked off in Britain by the Thatcher government’s so-called “Big Bang” and simultaneously across the Atlantic through the laissez-faire of “Reaganomics”6 – both enabled and encouraged this giddying maelstrom, allowing in the process the banking and insurance firms, the stockbrokerage and hedge funds that make up today’s “finance industry” to become the single most important “wealth creator” in the Anglo-American world. Meanwhile, declines in manufacturing output in Britain and America meant both nations were becoming increasingly dependent on a sustained growth in the financial sector – with “derivatives” satisfying that requirement for growth by virtue of their seemingly unbounded potential. Indeed, having risen to become by far the largest business sector simply in terms of profit-making, many of the largest banks and insurance groups had become “too big to fail”7. Failure leading potentially to national, if not international, economic ruin. Which is how the very systems that were supposedly designed to protect us, systems of insurance, have, whether by accident or design, left us more vulnerable than ever.

And then came the bombshell, as we learnt that the banks themselves were becoming bankrupt, having gambled their investments in the frenzy of deregulated speculation. Turns out that some of the money-men didn’t fully understand the complexity of their own systems; a few admitting with hindsight that they’d little more knowledge of what they were buying into than the rest of us. They’d “invested” because their competitors “invested”, and, given the ever-growing buoyancy of the markets at the time, not following suit would have left them at a competitive disadvantage. A desperate but strangely appropriate response to the demands of free market capitalism gone wild.

*

It is currently estimated that somewhere in the order of a quadrillion US dollars (yes, that’s with a qu-) has been staked on derivatives of various kinds. Believe it or not, the precise figure is actually uncertain because many deals are brokered in private. In the jargon of the trade these are called “over the counter” derivatives, which is an odd choice of jargon when the only things the average customer buys over the counter are drugs. Could it be that they’re unconsciously trying to tell us something again?

So just how big is one quadrillion dollars? Well, let’s begin with quadrillion. Quadrillion means a thousand trillion. Written at length it is one with a string of fifteen zeros. A number so humungous that it’s humanly impossible to properly comprehend: all comparisons fail. I read somewhere that if you took a quadrillion pound coins and put them side by side then they would stretch further than the edge of the solar system. The Voyager space programme was, of course, a much cheaper alternative. Or how about this: counting a number every second, it would take 32 million years to count up to a quadrillion… Now obviously that’s simply impossible – I mean just try saying “nine hundred and ninety-nine trillion, nine hundred and ninety-nine billion, nine hundred and ninety-nine million, nine hundred and ninety-nine thousand, nine hundred and ninety-nine” in the space of one second! You see it really doesn’t help to try to imagine any number as big as a quadrillion.
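The counting comparison at least is easy to check with a few lines of arithmetic (one number per second, ignoring leap seconds and the small matter of pronouncing fifteen-digit numbers):

```python
# How long would it take to count to a quadrillion at one number per second?
quadrillion = 10 ** 15
seconds_per_year = 60 * 60 * 24 * 365.25  # average Julian year

years = quadrillion / seconds_per_year
print(f"about {years / 1e6:.1f} million years")  # roughly 32 million
```

Which comfortably confirms the figure above: tens of millions of years, merely to count it.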

However, there are still useful ways to compare a quadrillion dollars. For instance, we can compare it against the entire world GDP which turns out to be a mere 60 trillion US dollars8. One quadrillion being nearly seventeen times larger. Or we might compare it against the estimated monetary wealth of the whole world: about $75 trillion in real estate, and a further $100 trillion in world stock and bonds. So one quadrillion is a number far exceeding even the total monetary value of the entire world – material and immaterial! A little freaky to say the least! Especially when we discover that many of these derivatives are now considered to be “toxic assets”, which is a characteristically misleading way of saying they are worth nothing – yes, worthless assets! – whatever the hell that means!

So just like the Sorcerer’s Apprentice, it seems that the spell has gone out of control, and instead of these mysterious engines making new money out of old money, the system has created instead an enormous black hole of debt. A debt that we, the people, are now in the process of bailing out, with extremely painful consequences. Efforts to save us from a greater catastrophe having already forced the British and US governments to pump multiple hundreds of billions of public money into the coffers of the private banks. Yet the banks and the economy remain broken of course, because how is any debt larger than the monetary value of the entire world ever to be repaid?

Another tactic to halt descent into a full-blown economic meltdown has involved the issuance of additional fiat currency in both Britain and America; a “quantitative easing” designed to increase the supply of money by simply conjuring it up (a trick that fiat currency happily permits). Money may not grow on trees but it can most certainly be produced out of thin air. But here’s the rub. For in accordance with the most basic tenets of economic theory, whenever extra banknotes are introduced into circulation, the currency is correspondingly devalued. So you may be able to conjure money from thin air, but all economists will readily agree that you cannot conjure “real value”, meaning real purchasing power. Indeed this common mistake of confusing “nominal value” (i.e., the number of pounds written on the banknote) with “real value”, is actually given a name by economists. They call it: “the money illusion”. And it’s useful to remind ourselves again that money has only relative value.

To understand this, we might again consider money to be a commodity (which in part it is, traded on the currency markets). As such, and as with all other commodities, relative scarcity or abundance will alter its market value, and, in obedience to the law of supply and demand, more will automatically mean less. This is just as true for the value of money as it is for tins of herring, plumbers, scotch eggs and diamonds. So it seems that if too much of our quantitative is eased, then we’d better be prepared for a drastic rise in inflation, or much worse again, for hyperinflation. Printing too much money is how hyperinflation has always been caused.
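The “money illusion” distinction between nominal and real value can be made concrete with a trivial deflation calculation. The figures here are invented for illustration: suppose the expanded money supply pushes prices up by 10%:

```python
# Nominal vs real value: all figures invented for illustration.
nominal = 20.0             # the number printed on the banknote
price_level_before = 1.00
price_level_after = 1.10   # prices rise 10% after the money-printing

real_before = nominal / price_level_before
real_after = nominal / price_level_after
print(f"real purchasing power: {real_before:.2f} -> {real_after:.2f}")
```

The note still says twenty, but it now buys roughly 9% less — the nominal value is conjured intact while the real value quietly leaks away.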

Our future is bleak, they tell us. Our future is in the red. So much for security, so much for insurance. We’d apparently forgotten to beware of “the Greeks” and of the “higher order Greeks” when they’d first proffered gifts.

*

I said earlier, just in passing, that money is actually pretty boring stuff, and it is… Truly, madly and deeply boring! So when I hear on the news how “the markets” are hoping that the latest round of “quantitative easing” will enable governments to provide the necessary “fiscal stimulus”, I am barely even titillated. Meanwhile, explanations in the popular press and the supposedly more serious media that liken such injections of new money to filling up my car with imaginary petrol provide me only with a far, far more entertaining distraction: to wit, a magical car that runs on air.

But then, of course, money isn’t really stuff at all! More properly considered, money is perhaps a sort of proto-derivative, since its worth is evidently dependent upon something other than the paper it’s (increasingly not) written on. So what is it that money’s worth depends upon? What underlies money? Well, the accepted answer to this question is apparently that money is a “store of value”. Although this leads immediately to the obvious follow-up question: in this context, what precisely is the meaning of “value”? But, here again there is a problem, since “value”, although a keystone to economic thinking, has remained something of an enigma. Economists unable to agree upon any single definitive meaning.

Is “value” determined by usefulness? Or is it generated by the amount of effort required in the production of things? Or perhaps there is some other kind of innate economic worth – for instance, in a thing’s scarcity? And can this worth be attributed at the individual level, or only socially imputed?

There are a wide variety of definitions and explanations of “value” which, being so foundational, have encouraged the various branches of economic theory to diverge. And here is another important reason why economics is in no way equivalent to the physical sciences. Ask any physicist what energy is, and they will provide both an unambiguous definition and, no less importantly, offer established methods for measurement. Because of this, if ever one physicist talks to another physicist about energy (or any other physical quantity) they can be absolutely certain that they are talking about the same thing. Which is very certainly not the case when economists talk about “value”.

“A cynic is a man who knows the price of everything and the value of nothing,” said Oscar Wilde, distinguishing with playful wisdom the difference in human terms between “price” and “value”. The great pity is that the overwhelming majority of today’s economists have become so cynical – but then perhaps they always were.

*

As part of his on-going assault against religion, Richard Dawkins recently published a book called The God Delusion. It’s the old hobby-horse again; one that he shares with a great many millions of other broadly liberal, literate and intelligent people. That religion is an evil of which humanity must rid ourselves totally. And yes, much of religion has been dumb and dangerous, this I will very readily concede (and already have conceded in earlier chapters). But really and truly, is it “the God delusion” that we should be most concerned about in these torrid times? For regardless of Dawkins’ claims, it is quite evident that religion is a wounded animal, and for good or ill, the secular world is most certainly in the ascendant. Right throughout the world, aside from a few retreating pockets of resistance, faith in the old gods has been gravely shaken. It is not that human faith, by which I mean merely a belief and/or worship of something greater, is extinguished, for it never can be, but that it has been reattached to new idol-ologies. And in those parts of the world where the old religions have been most effectively disarmed or expelled, namely the West, one idol-ology above all others has gathered strength from religion’s demise.

Richard Dawkins has said many times that instructing young children in religious obedience is a form of psychological child abuse and on this point I wholeheartedly support him. Children’s minds are naturally pliable for very sound developmental reasons. But is it less pernicious to fill their precious minds with boundless affection for let’s say Ronald McDonald? For this is merely one stark but obvious illustration of how a new fundamentalism has been inculcated in the young. Devotion to the brand. Love of corporations. Worship of the dollar and the pound.

This new kind of fundamentalism has long since swept across the world, but it is unusual, although not unique, in that it denies its own inherent religiosity whilst claiming to have no idols. This is the fundamentalism of free market neoliberal economics. The Father, Son and Holy Ghost having been forsaken, only to have been usurped by the IMF, the World Bank and the WTO. If you think I’m joking, or that this is mere hyperbole, then think again. When things are tough we no longer turn to the heavens, but instead ask what sacrifices can be made to “reassure the markets”. Sacrifices to make it rain money again.

Here, above all, is the most pernicious delusion of our age. And it has next to nothing to do with God, or Yahweh, or Allah, or even the Buddha. The prophets of our times talk of nothing besides profits or losses. They turn their eyes to the Dow Jones Index, trusting not in God, but only in money. So I call for Dawkins to leave aside his God delusion, for a moment, and pay a little attention to the rise and rise of “the money delusion”. If future historians reflect on our times, this is what they will see, and given the mess this “money delusion” is creating they will scratch their heads in disbelief and disgust.

*

I have already discussed the so-called “money illusion” – of mistaking nominal banknote value for real purchasing value – but this is merely one of many nested and interrelated illusions that make up “the money delusion”. Illusions that have become so ingrained within our permitted economic thinking that they are completely taken for granted.

Foundational is the belief that individuals always make rational choices. By definition, rational choice requires that we all choose with consistency and always with the aim of choosing more over less. That a huge advertising industry now exists to tempt us into irrationality is never factored in. Nor are the other corrosive influences that so obviously deflect our rational intentions: the coercion of peer pressure, our widespread obsession with celebrities and celebrity endorsement, and that never-ending pseudo-scientific babble that fills up many of the remaining column inches and broadcast hours of our commercial media. We are always eager for the latest fashionable fads, and perhaps we always were. Yet this glaring fact, that people make wholly irrational choices time and again, whether due to innate human irrationality or by deliberate design, is of little concern to most economists. It is overlooked and omitted.

Likewise, a shared opinion has arisen under the name of neoliberalism that economics can itself be neutral, usefully shaping the world without the nuisance of having to rely on value judgements or needing any broader social agenda. If only individuals were left to make rational choices, as of course they do by definition, or so the idea goes, and the market could also be unshackled, then at last the people will be free to choose. Thus, goes the claim, individual freedom can only be guaranteed by having freedom within the marketplace. Freedom trickling down with the money it brings. “Wealth creation” alone must solve our problems by virtue of it being an unmitigated good.

Of course, back in the real world, one man’s timber very often involves the destruction of another man’s forest. Making profits from the sale of drugs, tobacco and alcohol has social consequences. Factories pollute. Wealth creation has its costs, which are very often hidden. There is, in other words, and more often than not, some direct negative impact on a third party, known to economists as “spillover” or “externalities”, that is difficult to quantify. Or we might say that “wealth creation” for some is rather likely therefore to lead to “illth creation” for others.

Illth creation? This was the term coined by the Romantic artist, critic and social reformer John Ruskin, and first used in his influential critique of nineteenth-century capitalism entitled Unto This Last. Ruskin had presumably never heard of “the trickle-down effect”:

“The whole question, therefore, respecting not only the advantage, but even the quantity, of national wealth, resolves itself finally into one of abstract justice. It is impossible to conclude, of any given mass of acquired wealth, merely by the fact of its existence, whether it signifies good or evil to the nation in the midst of which it exists. Its real value depends on the moral sign attached to it, just as sternly as that of a mathematical quantity depends on the algebraical sign attached to it. Any given accumulation of commercial wealth may be indicative, on the one hand, of faithful industries, progressive energies, and productive ingenuities: or, on the other, it may be indicative of mortal luxury, merciless tyranny, ruinous chicane.”9

*

We are in the habit of regarding all money as equal. Presuming that the pounds and pence which make up my own meagre savings are equivalent in some directly proportional manner to the billions owned by let’s say George Soros. A cursory consideration shows how this is laughable.

For instance, we might recall that on “Black Wednesday” in 1992, Soros single-handedly shook the British economy (although, the then-Chancellor of the Exchequer Norman Lamont was left to shoulder the blame)10. But to illustrate this point a little further, let me tell you about my own small venture into the property market.

Lucky enough to have been bequeathed a tidy though hardly vast fortune, I recently decided to purchase a house to live in. The amount, although not inconsiderable by everyday standards (if compared say with the income and savings of Mr and Mrs Average), and very gratefully received, was barely sufficient to cover local house prices, except that I had one enormous advantage: I had cash, and cash is king.

For reasons of convenience, cash is worth significantly more than nominally equivalent amounts of borrowed money. In this instance I can estimate that it was probably worth a further 20–30%. Enough to buy a far nicer house than if I’d needed to see my bank manager. A bird in the hand…

Having more money also has other advantages. One very obvious example being that it enables bulk purchases, which being cheaper, again inflates its relative value. The rule in fact is perfectly straightforward: when it comes to money, more is always more, and in sufficient quantities, it is much, much more than that.

But then, of course, we have the market itself. The market that is supposedly free and thus equal. The reality being, however, that since money accumulates by virtue of attracting its own likeness, the leading players in the market, whether wealthy individuals or giant corporations, by wielding larger capital resources, can operate with an unassailable competitive advantage. These financial giants can and do stack the odds even higher in their favour by more indirect means, such as buying political influence with donations to campaign funds and by other insidious means such as lobbying – all of which is simply legally permitted bribery. The flaunted notion of a free market is therefore the biggest nonsense of all. There is no such thing as a free market: never has been and never will be.

The most ardent supporters of free market neoliberalism say that it is a non-normative system, which permits us finally to rid ourselves of disagreements over pesky value judgements. The truth, however, is very much simpler. By ignoring values, it becomes a system devoid of all moral underpinning. Being morally bankrupt, it is unscrupulous in the truest sense of the word.

*

If I had enough money and a whim, I might choose to buy all the plumbers and tins of herrings in Britain. Then, since money is (in part) a measure of scarcity, I could sell them back later with a sizeable mark-up. Too far-fetched? Well, perhaps, but only in my choice of commodity. The market in other commodities has without any question been cornered many times in the past. For instance, by the end of the 1970s, two brothers, Nelson Bunker and William Herbert Hunt, had accumulated and held what was then estimated to be one third of all the world’s silver. This led to serious problems both for high-street jewellers11 and for the economy more generally12, and as it happened, when the bubble burst on what became known as “Silver Thursday”, it also spelt trouble for the brothers’ own fortune. Fortunately for them, however, the situation was considered so serious that a consortium of banks came forward to help to bail them out13. They had lost, their fortune diminished, although by no means wiped out. As relatively small players they’d played too rough; meanwhile much larger players ensure that the markets are routinely rigged through such manufacture of scarcity. Going back as early as 1860, John Ruskin had already pointed out a different but closely-related deficiency in any market-driven capitalist system of trade:

“Take another example, more consistent with the ordinary course of affairs of trade. Suppose that three men, instead of two, formed the little isolated republic, and found themselves obliged to separate, in order to farm different pieces of land at some distance from each other along the coast: each estate furnishing a distinct kind of produce, and each more or less in need of the material raised on the other. Suppose that the third man, in order to save the time of all three, undertakes simply to superintend the transference of commodities from one farm to the other; on condition of receiving some sufficiently remunerative share of every parcel of goods conveyed, or of some other parcel received in exchange for it.

“If this carrier or messenger always brings to each estate, from the other, what is chiefly wanted, at the right time, the operations of the two farmers will go on prosperously, and the largest possible result in produce, or wealth, will be attained by the little community. But suppose no intercourse between the landowners is possible, except through the travelling agent; and that, after a time, this agent, watching the course of each man’s agriculture, keeps back the articles with which he has been entrusted until there comes a period of extreme necessity for them, on one side or other, and then exacts in exchange for them all that the distressed farmer can spare of other kinds of produce: it is easy to see that by ingeniously watching his opportunities, he might possess himself regularly of the greater part of the superfluous produce of the two estates, and at last, in some year of severest trial or scarcity, purchase both for himself and maintain the former proprietors thenceforward as his labourers or servants.”14

By restricting the choices of others, one’s power over them is increased, and it is this that brings us to the real reason why money becomes such an addiction, especially for those who already have more than they know what to do with. For truly the absolute bottom line is this: money and power become almost inseparable unless somehow a separation can be enforced. And as wealth accumulates, especially when excessive, as it almost invariably does, the accumulation of power goes along with it. This underlying and centralising mechanism has perhaps always operated at the heart of all civilisation. But even the power of money has its limits, as Ruskin points out:

“It has been shown that the chief value and virtue of money consists in its having power over human beings; that, without this power, large material possessions are useless, and to any person possessing such power, comparatively unnecessary. But power over human beings is attainable by other means than by money. As I said a few pages back, the money power is always imperfect and doubtful; there are many things which cannot be reached with it, others which cannot be retained by it. Many joys may be given to men which cannot be bought for gold, and many fidelities found in them which cannot be rewarded with it.

“Trite enough, – the reader thinks. Yes: but it is not so trite, – I wish it were, – that in this moral power, quite inscrutable and immeasurable though it be, there is a monetary value just as real as that represented by more ponderous currencies. A man’s hand may be full of invisible gold, and the wave of it, or the grasp, shall do more than another’s with a shower of bullion. This invisible gold, also, does not necessarily diminish in spending. Political economists will do well some day to take heed of it, though they cannot take measure.”15

Until such a time, every action and probable outcome must continue to be evaluated on the basis of strict cost and benefit estimates. Our “ponderous currencies” literally enable a figure to be set against each human life – an application fraught with the most serious moral dilemmas and objections – and beyond even this, we have price tags for protecting (or else ruining) the natural environment all our lives depend upon. For only the market can secure our futures, optimally delivering us from evil, though inevitably it moves in mysterious ways. This is how the whole world – land, water, air and every living organism – came to be priced and costed. Everything is set against a notional scale that judges exclusively in terms of usefulness and availability: such is the madness of our money delusion.

We are reaching a crisis point. A thoroughgoing reappraisal of our financial systems, our economic orthodoxies, and our attitudes to money per se is desperately required. Our survival as a species may depend on it. Money ought to be our useful servant, but instead it remains, at least for the vast majority, a terrible master. As a consequence, our real wealth has been too long overlooked. Time then for this genie called money to be forced back tight inside its bottle. Ceaselessly chasing its golden behind, and mistaking its tight fist for the judicious hand of God, is leading us ever further down the garden path, further and further away from the land it promises.

Next chapter…

*

Addendum: Q & A

Back in April 2012, I forwarded a draft of this chapter to friends in Spain (a nation already suffering under imposed “austerity measures”). They sent an extended reply which raised two interesting and important questions. Both questions along with my replies are offered below:

Q1: You seem to be saying that printing money (as the US and UK, who are in control of their own currency, are doing) is as bad as dealing with the debt problem by means of austerity (the “Merkozy” approach). But surely the latter is worse.

A. I think these are simply two sides of the same scam. The bankers create an enormous unpayable debt and then get governments to create new money to bail them out. This is sold to us as a way of bailing out a few chosen victims (Greece, Spain, Portugal, Ireland) although it simply means a huge transfer of wealth from public into private hands. To make that money useful to the bankers (and the rest of the ruling elite) ‘austerity measures’ are put in place which not only steal money off the average person but also permit the fire sale of national assets. Meanwhile, in Britain and America, the governments are helping to pay for these bailouts by creating money out of thin air, which means the real value of our money is reduced through inflation (effectively a hidden tax). If the money were invested in infrastructure or education or whatever, then this could potentially be a good thing (even though it still creates inflation), so certainly QE could have been beneficial but not when you use the money only to keep afloat a huge Ponzi scheme. But then you ask later…

Q2: ‘but how come the pound is high now and the euro low’

A. That’s a very good question and I won’t pretend that I understand this completely, but I gather there are plenty of ways for keeping currencies higher than they ought to be by manipulating the markets [incidentally, the Forex Scandal to manipulate and rig the daily foreign exchange rates did not come to light until Summer 2013]. The market is rigged in any case by virtue of the fact that the dollar remains the world’s reserve currency and that oil is traded entirely in dollars. But essentially what’s going on here is a huge currency war, and the euro is constantly under attack from speculators. I am fairly certain that the chickens will come home to roost sooner or later in America and Britain (and in Germany too), but meanwhile the governments simply go about cooking the books and telling us how inflation is only 4% or whatever when fuel prices, for instance, have rocketed during the past few years. In any case, we get ‘austerity’ too, not as hardline yet as the ‘austerity’ being imposed elsewhere, but it will come – of this I have no doubt. Either it will happen slowly, or worse, there will be a huge war and the ‘austerity’ will be brought into place to justify the expense of that. This is a deliberate attack by the bankers against the people of the world, and until the people of the world say that’s enough, and most of the debts are cancelled outright, I don’t see any way this can be reversed.

*

Another topic I briefly touched upon in the chapter above is the matter of inflation. What is it and what causes it? My answers were sketchy, in part, because I wished to avoid getting too bogged down in technicalities beyond my training. But this question about the causes of inflation is, in any case, an extremely thorny one. Different schools of economists provide different explanations.

One less orthodox account that I have frequently come across is that our fractional reserve banking system when combined with a central bank’s issuance of a fiat currency is inherently inflationary. That in the long term, and solely because of these extant monetary mechanisms, inflation is baked into the cake. So I wrote to a friend who holds with the above opinion and asked if he would explain “in the briefest terms that are sufficient” why he and others believe that central bank issuance of currency and fractional reserve banking are the primary underlying cause of inflation. Here is his succinct but detailed reply:

In a central bank system, money is created in the first instance by governments issuing bonds to banks and banks “printing” money and handing it over to the government in return. The government then owe the banks the money plus interest. If they ever pay back any of the principal, then a corresponding amount of bonds are handed back, i.e. cancelled. In that case, the money repaid goes out of existence!

Before elaborating any further, let’s take a step back. Fractional reserve lending doesn’t require central banks, nor does it require governments to create money by issuing bonds in exchange for it. Fractional reserve lending is simply the act of taking someone’s money to “look after it”, then turning around and lending a fraction of it to someone else. If the lender has enough depositors, then the sum of all the unlent fractions of each deposit should cover him if one of them suddenly comes through the door asking for all their money back in one go. As I’m sure you know, if too many turn up at once looking for their money, a run ensues. Fractional reserve banking doesn’t even require a government sanctioned paper currency to exist. Depositors can simply deposit something like gold and the lenders can issue receipts which become the paper currency.

In olden times, when depositors of gold first found out that the goldsmiths they were paying to store their gold safely were lending it out for a percentage fee, they were outraged. The goldsmiths appeased them by offering them a cut of the fee for their interest in the scam. Accordingly, this money became known as ‘interest’.

So where do central banks fit in? Countries like the United States prior to 1913 operated without central banks. There were thousands of banks of all sizes. To compete with one another, they had to endeavour to offer higher interest to depositors, lower interest rates to borrowers or to cut the fraction of deposits that they kept in reserve. This latter aspect was what occasionally caused banks to go to the wall, to the detriment of their depositors.

Central banking avoids this risk because the same fractional reserve ratio applies to all the banks under a central bank’s jurisdiction. However, it is really a way to avoid competition and if the system ever does get into trouble, the government feel obliged to bail it out or risk collapse of the whole system.

Now to answer your question about inflation.

In a fractional reserve central bank system, money is created as I’ve described by the government issuing bonds to the bank, receiving money created out of thin air and having to pay interest on it. When they spend it by paying salaries of government employees, contractors, arms manufacturers and so on, that money goes straight into bank accounts and the bankers can’t wait to lend out as much of it as possible, up to the limit of whatever fractional reserve ratio applies. So now there is a double claim on the money. The government employee thinks their salary is sitting in the bank but 90 percent of it is in the pocket of a borrower who thinks it’s theirs as long as they keep up with interest. That borrower will inevitably either put the borrowed sum in their own bank account or spend it. Either way it will end up in another bank account somewhere. Then the same thing happens again; up to 90 percent of it gets lent out (81 percent of the original government-created money) and so on…

We end up in a situation where all of the money in circulation has arisen from someone, somewhere, signing on the dotted line to put themselves in debt. The money isn’t backed by a commodity such as gold. Instead it is backed by the ability of the borrower to repay. All these borrowers, including the government, are paying interest. If interest is to be paid on every penny in circulation, then it doesn’t take a genius to figure out that new money must be continuously ‘created’ to keep paying this. That occurs by governments constantly borrowing so that their debts keep on increasing and borrowers constantly borrowing more and more. This seems to work as long as prices, wages and asset values keep increasing. Generation after generation, workers can afford to pay more and more for the houses that they live in because the price of the house keeps going up so it looks like good collateral to the lender and also their wages keep going up, so the borrower can meet payments in the eyes of the lender.
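
The relending chain sketched in this reply is just a geometric series, and it can be checked with a few lines of Python. This is an illustrative calculation only; the 10 percent reserve ratio and the round starting figure of 100 are assumptions for the sake of the arithmetic, not claims about any actual bank:

```python
# Illustrative money-multiplier arithmetic under fractional reserve lending.
# Assumes a 10% reserve ratio: 90% of each deposit is lent out and redeposited.

def total_deposits(initial, reserve_ratio, rounds):
    """Sum of all deposits after a number of lend-and-redeposit rounds."""
    total = 0.0
    deposit = initial
    for _ in range(rounds):
        total += deposit                 # this round's deposit sits in a bank
        deposit *= (1 - reserve_ratio)   # the fraction lent out and redeposited
    return total

# The series 100 + 90 + 81 + ... converges to initial / reserve_ratio,
# i.e. a tenfold expansion of the original government-created money.
print(round(total_deposits(100.0, 0.10, 1000)))  # -> 1000
```

After three rounds the running total is 100 + 90 + 81 = 271, matching the figures in the reply above; left to run, the total approaches ten times the original sum.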

Working out what the rate of inflation is at any given time is practically impossible. Government figures such as RPI and CPI are just another tool for the propagandists to use as they see fit at any given time. However for the banks to gain anything from the game, the rate of inflation must be:

  • less than the rate of interest paid by borrowers; and
  • greater than the rate of interest paid to savers.
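
That two-sided condition can be stated directly. A minimal sketch, where all the rates are made-up figures purely for illustration:

```python
# The band described above: the bank gains in real terms only when inflation
# sits between the rate paid to savers and the rate charged to borrowers.

def bank_gains(saver_rate, borrower_rate, inflation):
    """True if inflation lies strictly between the two interest rates."""
    return saver_rate < inflation < borrower_rate

# e.g. savers receive 1%, borrowers pay 7%, inflation runs at 4%:
print(bank_gains(0.01, 0.07, 0.04))  # -> True
# If inflation outpaces what borrowers pay, the margin is gone:
print(bank_gains(0.01, 0.07, 0.09))  # -> False
```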

This is why savers’ money is ‘eroded’ if they just leave it sitting in a bank account.

Now imagine a different system where:

  • governments issue paper money by printing it themselves;
  • the amount in circulation is absolutely fixed;
  • there is no central bank but there are plenty of independent banks.

In such a country, there is no need for the government to have any debt and there is ample historical evidence of nations that have existed without government debt for very long stretches of time. What borrowers there are have to find the interest by earning it from the fixed pool of currency that is in circulation. There is little need for anyone to borrow but that’s something that most people you speak to have difficulty accepting. That’s because they’ve only ever lived in a system where they spend their lives in the service of debt and cannot conceive of it being any different.

The bankers right at the top of the system aren’t out to grab hold of all the money in the world. They’re not after all the tangible assets in the world either. Their only goal is to ensure that as much human labour as possible is in the service of debt.

Now for something different. How can this whole thing go horribly wrong for the bankers? I don’t just mean a run on banks or a recession. That happens periodically and is known as the business cycle. People lose confidence and are reluctant to borrow for a number of years, then they regain confidence and start to borrow again and the whole thing picks up and the cycle repeats.

What can go horribly wrong is if, after generations and generations and generations of increasing prices and debts, everyone gets more spooked by debt than ever before and totally fixated on repaying it. They sell assets but there are so many folk doing that that asset prices start to decline. That spooks people further. A spiral is under way. Banks try to ‘stimulate’ the economy by lowering interest rates but there is very little confidence around, especially if asset prices are declining compared with debts and wages aren’t rising either (or may be in decline), so that the ability to repay debt is impaired. This decline can be long and protracted. Also there can be many ups and downs along the way, although the long term trend is down. Ups can be deceptive as they are perceived as “coming out of the recession” by those used to the normal business cycles we’ve experienced throughout the whole of the twentieth century. In this way, asset prices can bleed away until eventually they reach something like a tenth of their peak value. This process can reach a very late stage before a lot of people recognise what’s really going on. This is just a scenario but one worth considering seriously. We could be in for long term deflation but it will be well under way and too late for many people in debt by the time it gets mainstream acknowledgement.

A closely related question, and one that follows automatically, is why countries bother having central banks at all. Instead of a government issuing bonds, why not issue the currency directly, thereby cutting out the middle men? It is an approach that actually has a number of historical precedents, as pointed out in this open letter to Obama urging him to reissue ‘greenbacks’ and the campaign in Britain to print ‘treasury notes’ like the Bradbury Pound. So in a further reply to my friend I asked him, “do you think that the re-issuance of ‘greenbacks’ in America or the Bradbury Pound in the UK might offer a realistic solution to the current crisis?” His response:

The issue of greenbacks or whatever you call them (essentially government-issued money) would probably make no immediate difference. Already, the money created by quantitative easing is not working its way into the system, so why would money issued by any other means?

In the longer term, such a fundamental upheaval would make a huge difference as the government wouldn’t need to be in debt the whole time and people wouldn’t have to keep paying increasing prices for houses and cars on top of interest. Pensioners wouldn’t be on a treadmill, having to ‘invest’ their savings in a vain effort to keep up with inflation.

There’s a risk that the government might be tempted to print more and more money, which is often cited as a point in favour of the present system. It is claimed that having to pay interest and ultimately repay the whole principal is a disincentive in this respect. However, the current system ensures constant “printing” all the time as there’s no way that everyone involved can pay interest otherwise.

There’s talk at the moment about banks charging people a few percent for holding their money on deposit, i.e. “negative interest”. People think they’ll lose money as their account balances will go down over time. However, it’s no different to being paid, say, 6 percent interest at a time when inflation is at 9 percent and the cheapest loan you can get is 12 percent.
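
The point being made is about real (inflation-adjusted) returns rather than nominal ones. A small sketch, reusing the figures quoted above purely for illustration:

```python
# Real (inflation-adjusted) rate of return, via the exact Fisher relation.

def real_rate(nominal, inflation):
    """Return the real rate of return as a decimal fraction."""
    return (1 + nominal) / (1 + inflation) - 1

# 6% interest while inflation runs at 9%:
old_world = real_rate(0.06, 0.09)
# a 2% "negative interest" charge while prices are flat (a deflationary setting):
new_world = real_rate(-0.02, 0.00)

print(f"{old_world:+.2%}  {new_world:+.2%}")  # both in the region of -2% to -3%
```

In both cases the saver loses a couple of percent a year in purchasing power; only the headline number differs.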

I’m amazed at how people in the alternative media can inform us that banks are going to charge us ‘negative interest’ for our deposits, express outrage and then in the next breath claim that we’re in a hyperinflationary environment. Low/negative interest is a sure sign of massive deflationary pressure. I don’t know what’s going to happen but I’m convinced that deflation’s the one to watch. It has the potential to catch people out.

Getting back to your original question, the direct issuing of money by the government would represent a seismic shift of power from bankers to governments; a shift in the right direction, no doubt. It’s only possible if everyone knows exactly what’s going on. We’re a very long way off yet. People’s understanding of the banking scam is very, very poor.

I would add that very much front and centre in that scam is the role of the central banks: extraordinarily powerful commercial bodies that adopt the outward appearance of public institutions when in fact they work for commercial interests. The US Federal Reserve, for instance, is a de facto private corporation and all of its shareholders are private banks. The status of the Bank of England is more complicated. This is what the main Wikipedia entry intriguingly has to tell us:

Established in 1694, it is the second oldest central bank in the world, after the Sveriges Riksbank, and the world’s 8th oldest bank. It was established to act as the English Government’s banker, and is still the banker for HM Government. The Bank was privately owned [clarification needed (Privately owned by whom? See talk page.)] from its foundation in 1694 until nationalised in 1946.[3][4] 

Original references retained.

Clarification needed indeed! Anyway, nowadays it is officially (since 1998) an ‘independent public organisation’. However, the BoE is not really as independent as it might first appear, since along with eighteen other central banks from around the world (including the US Federal Reserve) it is a member of the executive of “the central bank for central banks” – the little known Bank for International Settlements (BIS) based in Basel, Switzerland. To hear more about the history, ownership and function of this highly profitable (tax free and extraterritorial) organisation, I recommend listening to this interview with Adam LeBor, author of the recently released book The Tower of Basel:

For my own more detailed thoughts on effective remedies to the on-going financial crisis please read this earlier post.

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text all newly incorporated text has been italicised.

*

1 From “The Future”, Essays in Persuasion (1931) Ch. 5, John Maynard Keynes, CW, IX, pp.329 — 331, Economic Possibilities for our Grandchildren (1930).

2 Adam Smith applied “the law of diminishing utility” to solve “the paradox of water and diamonds”. Water is a vital resource and most precious to life and yet it is far less expensive to purchase than diamonds, comparatively useless shiny crystals, which in his own times would have been used solely for ornamentation or engraving. The reason, Smith decides, is that water is readily abundant, such that any loss or gain is of little concern to most people in most places. By contrast, the rarity of diamonds means that, although less useful overall, any loss or gain of use is more significant, or to put it more formally the “marginal utility” is greater.

3 Extract taken from The soul of man under socialism by Oscar Wilde (first published 1891).

4 Legal tender is a technical legal term that basically means an offer of payment that cannot be refused in settlement of a debt.

5 Fiat (Latin), “let it be done” meaning that these currencies are guaranteed by government decree only.

6 Milton Friedman pays homage to Ronald Reagan’s record on deregulation in an essay entitled “Freedom’s friend” published in the Wall Street Journal on June 11, 2004. Drawing evidence from The Federal Register, which “records the thousands of detailed rules and regulations that federal agencies churn out in the course of a year”, Friedman contrasts Reagan’s record with that of Presidential incumbents before and since: “They [the rules and regulations] are not laws and yet they have the effect of laws and like laws impose costs and restrain activities. Here too, the period before President Reagan was one of galloping socialism. The Reagan years were ones of retreating socialism, and the post-Reagan years, of creeping socialism.” For socialism read regulation. http://online.wsj.com/news/articles/SB108691016978034663

7 Definition of “too big to fail” taken from Businessdictionary.com: “Idea that certain businesses are so important to the nation, that it would be disastrous if they were allowed to fail. This term is often applied to some of the nation’s largest banks, because if these banks were to fail, it could cause serious problems for the economy. By declaring a company too big to fail, however, it means that the government might be tempted to step in if this company gets into a bad situation, either due to problems within the company or problems from outside the company. While government bailouts or intervention might help the company survive, some opponents think that this is counterproductive, and simply helping a company that maybe should be allowed to fail. This concept was integral to the financial crisis of the late 2000s.”

8 According to IMF economic database for October 2010, World GDP is $61,963.429 billion (US dollars).

9 Unto This Last is based on a collection of four essays first published in the monthly Cornhill Magazine, 1860, and then reprinted as Unto This Last in 1862. This extract is drawn from his second essay: “The Veins of Wealth”

10 George Soros proudly explains the events of “Black Wednesday” on his official website: “In 1992, with the economy of the United Kingdom in recession, Quantum Fund’s managers anticipated that British authorities would be forced to break from the European Exchange Rate Mechanism (ERM) then in force and allow the British pound to devalue in relation to other currencies, in particular the German mark. Quantum Fund sold short (betting on a decline in value) more than $10 billion worth of pounds sterling. On September 16, 1992—later dubbed “Black Wednesday”—the British government abandoned the ERM and the pound was devalued by twenty percent.” http://www.georgesoros.com/faqs/archive/category/finance/

11 “Last year [1979] Bunker and his syndicate began buying silver again, this time on a truly gargantuan scale. They were soon imitated by other speculators shaken by international crises and distrustful of paper money. It was this that sent the price of silver from $6 per oz. in early 1979 to $50 per oz. in January of this year. Chairman Walter Hoving of Tiffany & Co., the famous jewelry store, was incensed. Tiffany ran an ad in the New York Times last week asserting: ‘We think it is unconscionable for anyone to hoard several billion, yes billion, dollars worth of silver and thus drive the price up so high that others must pay artificially high prices for articles made of silver from baby spoons to tea sets, as well as photographic film and other products.’” Extract taken from “He Has a Passion for Silver”, article published in Time Magazine, Monday 7 April, 1980. http://content.time.com/time/magazine/article/0,9171,921964-2,00.html

12 “Many Government officials feared that if the Hunts were unable to meet all their debts, some Wall Street brokerage firms and some large banks might collapse.” Extract taken from “Bunker’s busted silver bubble”, article published in Time Magazine, Monday 12 May, 1980. http://content.time.com/time/magazine/article/0,9171,920875,00.html

13 “What may deal the Hunt fortune a fatal blow is the fallout from the brothers’ role in the great silver-price boom and bust of 1980. Thousands of investors who lost money in the debacle are suing the Hunts. On Saturday the brothers lost a civil case that could set an ominous precedent. A six-member federal jury in New York City found that the Hunts conspired to corner the silver market, and held them liable to pay $63 million in damages to Minpeco, a Peruvian mineral-marketing company that suffered heavy losses in the silver crash. Under federal antitrust law, the penalty is automatically tripled to $189 million, but after subtractions for previous settlements with Minpeco, the total value of the judgment against the Hunts is $134 million.” Extract taken from “Big bill for a bullion binge”, article published in Time Magazine, Monday 29 August, 1988. http://content.time.com/time/magazine/article/0,9171,968272-1,00.html

14 Extract also taken from the second essay, entitled: “The Veins of Wealth” of Unto This Last by John Ruskin.

15 Ibid.
