
aimless weather

The following article is Chapter One of a book entitled Finishing The Rat Race.

All previously uploaded chapters are available (in sequence) by following the link above or from the category link in the main menu, where you will also find a table of contents and a preface on why I started writing it.

*

“Let no one enter here who does not have faith”

— Inscription over the door of Max Planck’s Laboratory

*

“In the beginning God created the heaven and the earth. And the earth was without form, and void; and darkness was upon the face of the deep. And the spirit of God moved upon the face of the waters. And God said, Let there be light: and there was light.”

These were the words spoken by the astronauts on board Apollo 8 once they had established a lunar orbit, thereby becoming the first humans ever to leave Earth fully behind them. As a literary choice, it was one that inevitably caused considerable irritation, especially amongst atheists around the world.

Undoubtedly there was more than a little politics involved in the Apollo astronauts’ decision to read passages from the Bible. Given that the Cold War face-off had provided such impetus for the entire space programme, and that America had steadily beaten off the challenge of the godless Soviets, the words transmitted, if nothing else, a kind of undiplomatic rebuke – one redoubled when the Eagle landing module touched down just a few months later and the astronauts’ first duty became to plant the Stars and Stripes in the pristine moon dust. Skipping about in delight, taking some holiday snaps and bringing home a basket full of moon-rocks was no longer enough.

Not that I am trying to rain on anyone’s parade. Far from it. The Moon landing involved not merely a tremendous technical achievement but also a hell of a lot of guts. It was one moment when ordinary Americans had every reason to feel pride. Viewed in an alternative light, however, this towering and singular accomplishment was also the extraordinary end product of many centuries of truly international effort: a high point in a centuries-long science and engineering project set in motion by pioneers like Galileo, Kepler and, of course, Newton, which culminated on July 20th 1969 in an event so genuinely epoch-marking that for many minutes the world collectively held its breath … 1

Apollo 8 had been just another of the more important reconnaissance missions necessary to lay the groundwork for the moon landing itself. Another small step that led directly to that most famous step in history (so far), although as a step, the Apollo 8 mission was also breathtaking in its own right. As for the grumbling about the transmission of passages from Genesis, well, the inclusion of any kind of religious element seemed inappropriate to many. After all, science and religion are not supposed to mix – besides which, having gained seeming ascendancy, why was Science suddenly playing second fiddle again?

Religion, as a great many of its opponents readily point out, is superstition writ largest. Science, by contrast, purposefully renounces the darkness of superstition, and operates solely by virtue of the assiduous application of logic and reason. Science and religion are therefore as incompatible as night and day, and so when it came to the cutting edge of space exploration, just what did the Bible have to do with any of it? Sir Isaac Newton was doing the driving, wasn’t he…?

On the other hand, and playing Devil’s Advocate, why not choose these words? After all, the circumstances lent a strange appropriateness and charge to the plain vocabulary of Genesis: heaven and earth; void and darkness; the face of the waters. A description of the act of creation so understated, and yet so evocative, that it’s hard to recall a more memorable paragraph in the whole literary canon, and few with greater economy. If the astronauts or NASA were endorsing the biblical story of creation that would have been another matter, of course, but here I think we can forgive the perceived faux pas – ‘one false step amidst a giant leap forward for mankind!’

My personal wish is that as Neil and Buzz were setting off to “where no man had gone before”, climbing into their Lunar Module and sealing the hatch behind them, they might forgetfully have left the flag behind to keep Michael Collins company. Leaving no signs of their extraordinary visit besides the landing section of the strange metal beetle they had flown in, and, beside it, their monumental, and somehow still astonishing, footprints.

*

Very occasionally I happen to meet intelligent and otherwise rational people who’ll make the claim that the biblical story of creation is broadly supported by the latest scientific discoveries. The universe began at a moment, they’ll explain, just as it is written. There then followed a succession of events, leading to the eventual rise of Man. All of this, they’ll insist, accurately checks out with the opening page of Genesis, whilst the theories of modern cosmology and evolutionary biology simply patch in the occasional missing detail. And truly, this is a desperate line of defence!

For there is no amount of creative Biblical accountancy – of interpreting days as epochs and so forth – which can successfully reconstruct the myth of Genesis in order to make it scientifically sound. The world just wasn’t created that way – wasn’t created at all, apparently – and creationism, which often claims to be an alternative theory when it offers no theory at all, also fails to withstand the minutest degree of scrutiny. No, creationism survives merely on account of the blind and desperate faith of its adherents. Here indeed is how a modern cosmologist might have gone about rewriting the Biblical version (if by chance they had been on hand to lend God a little assistance):

“In the beginning God created a small but intense fireball. A universal atom into which space and time itself were intrinsically wrapped. As this primordial fireball very rapidly expanded and cooled, the fundamental particles of matter condensed out of its energetic froth and, by coalescence, formed the nuclei of hydrogen, helium and lithium. All this passed in a few minutes.

Clouds of those original elements, collapsing under their own weight, then formed into the first stars. The loss of gravitational potential energy heated the gases in these proto-stars to sufficiently high temperatures (many millions of degrees) to trigger nuclear fusion. In the cores of such early giants, hydrogen and helium nuclei were now just beginning to be fused into ever-heavier elements through a series of stages known as nucleosynthesis. Happily, this fusion of smaller atoms into increasingly larger ones generated an abundance of energy. Enough to keep the core temperature of each star above many millions of degrees; hot enough to sustain the fusion of more and more atoms. So it was that the hydrogen begat helium, and helium begat carbon, and carbon begat oxygen and neon… And God saw that it was good.

After a few billion years had passed, these same stars, which had hitherto been in a state of hydrostatic balance – thermal and radiation pressure 2 together supporting the weight of the gases – were burning low on fuel. During this last stage, at the end of a long chain of exergonic 3 fusion reactions, atoms as large as iron were being created for the very first time. Beyond the production of iron, however, nucleosynthesis into still heavier elements absorbs more energy than it releases, and so the process of fusion could no longer remain self-sustaining. So it came to pass that the first-generation stars were starting to die.

But these stars were not about to fizzle out like so many guttering candles. The final stage of their demise involved not a whimper, but bangs of unimaginable power. Beginning as a collapse, an accelerating collapse that would inevitably and catastrophically rebound, each star was torn apart within a few seconds, the remnants propelled at hyper-velocities out into deep space. And it was during these brief but almighty supernova explosions that the heavier elements (lead, gold and ultimately all the stable elements in the periodic table) came into being.

Ages came and passed. Pockets of the supernova debris, now drifting about in tenuous clouds, and enriched with those heavier elements, began to coalesce a second time: the influence of gravity rolling the dust into new stars. Our Sun is a star born not from that generation but the next, being one of an almost countless number of third-generation stars: our entire Solar System emerging from a twice-processed aggregation of swirling supernova debris. All this came to pass around 5 billion years ago, some 9 billion years after the birth of time itself.”
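As a footnote of my own to this pastiche: the ‘abundance of energy’ is simple mass-energy bookkeeping. Fusing four hydrogen nuclei (protons) into one helium nucleus destroys roughly 0.7% of their rest mass, and the deficit is released in accordance with E = mc²:

$$4\,{}^{1}\mathrm{H} \;\rightarrow\; {}^{4}\mathrm{He} + 2e^{+} + 2\nu_{e}, \qquad Q \approx 0.007 \times \left(4 m_{p} c^{2}\right) \approx 26.7\ \mathrm{MeV}.$$

Binding energy per nucleon peaks near iron, at about 8.8 MeV, which is why fusion beyond iron absorbs rather than releases energy, and why the stellar furnace must eventually gutter out.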

Now very obviously in this modern reworking there can be no Earth at the time of creation, so the story in Genesis fails to accord with the science right from its outset: from chapter one, verse one. For there is simply no room for the Earth when the whole universe is still smaller than a grapefruit.

I can already hear the protests of course: for Earth we must read Universe, apparently, in order to make any meaningful comparison. Okay, so playing along, what then becomes of heaven? For God created both heaven and earth, remember. Well, if heaven was once some place above our heads (as it surely was for people living under the stars at the time when Genesis was written) then, to accord with the current theories of cosmology, perhaps those who still subscribe to the entire Biblical story imagine its existence as a parallel universe, linked through a wormhole we call death. Truly, the Lord works in mysterious ways!

*

Some readers will doubtless baulk at the idea of God being the creator of anything, and yet I think we should honestly admit that nothing in modern cosmology precludes with certainty the existence of an original creative force; of God only as the primum movens, the prime mover, igniting the primordial spark. Indeed, it may come as a surprise to discover (as it did for me) that one of the first proponents of the currently accepted scientific theory – now universally known as the Big Bang Theory – was by vocation a Roman Catholic priest.

Father Georges Lemaître, a Belgian professor of physics and astronomy, having quickly recognised the cosmological possibilities latent within Einstein’s then still novel theory of General Relativity, published his ‘hypothesis of the primeval atom’ in the prestigious scientific journal Nature as long ago as 1931. Yet interestingly, his ideas did not receive much support at the time, in part due to lack of evidence, but also because many contemporary physicists initially rejected all such theories of spontaneous universal origin as being an entirely religious import. But science isn’t built on belief, and so it can’t be held hostage to orthodoxy in the same way that religious conviction can. This is where science and religion absolutely depart. Although, in order to explore this further, it is first helpful to consider two vitally important though rather difficult questions: “what is science?” and “what is religion?”

*

I have a friend who tells me that science is the search for knowledge; an idea that fits very happily with the word’s etymology: from the Latin scientia, meaning “knowledge” (from scire, “to know”). Meanwhile the dictionary itself offers another useful definition: “a branch of knowledge conducted on objective principles involving the systematized observation of, and experiment with, phenomena.” According to this more complete description, science is not any particular set of knowledge, but rather a system or systems that aim at objectivity.

Scientific facts exist, of course, but these are simply ideas so thoroughly tested that they are no longer seriously contested. For instance, that the Earth is a ball that moves around the Sun. This is a fact and it is a scientific one. For the most part, however, scientists do not work with facts as straightforward as this, and rather than facts, the most common currency of working scientists is theories. Scientific theories are not to be believed in as such, but are a means to encompass the best understanding available. They exist in order to be challenged, and thus to be improved upon.

In Science, belief begins and ends as follows: that some forms of investigation, by virtue of being objective, lead to better solutions than other, less objective approaches. This is the only orthodoxy to which all scientists are committed, and so, in the final analysis, being scientific means nothing more or less than an implicit refusal to admit knowledge aside from what can be observed and measured. For Science is an inherently empirical approach, with its prime directive and perhaps also its élan vital being: in testing, we trust.

I could leave the question of science right there and move on to consider the question of religion, but before I do so, I would like to put one important matter straight. Whatever it is that science is and does, it also helps to understand that the majority of scientists rarely if ever consider this question.

As a physics undergraduate myself, I learnt quite literally nothing about the underlying philosophies of science (there was an additional module – a final-year option – addressing this topic, but unfortunately it was oversubscribed). Aside from this, I was never taught to analyse the empirical method in and of itself. I personally learnt absolutely nothing about hypotheses, let alone how to test them (and in case this should lead readers to think my university education was itself substandard, then let me also admit, at the risk of appearing an arrogant braggart, that I attended one of the best scientific academies in the country – Imperial College would no doubt say the best). Yet they did not teach us about hypotheses, and for the simple reason that the vast majority of physicists rarely bother their heads about them. Instead, the scientists I’ve known (and I was a research student for three years) do what might be broadly termed “investigations”.

An investigation is just that, and it might involve any variety of techniques and approaches. During the most exciting stages of the work, the adept scientist may indeed rely as much on guesswork and intuition as on academic training and logical reasoning. Famously, for example, the chemist August Kekulé dreamt up the structure of benzene in his sleep. Proving the dream correct obviously required a bit more work.

The task set for every research scientist is to find answers. Typically then, scientists are inclined to look upon the world as if it were a puzzle (the best puzzle available), and as with any other puzzle, the point is just to find a satisfactory solution.

So why then did I begin with talk of scientific methods? Well because, as with most puzzles, some methods will prove more efficacious than others, but also because in this case there is no answer to be found at the bottom of page 30 – so we’d better be as sure as we can, that the answer we find is the best available one. Which in turn means applying the best (i.e., most appropriate and reliable) methods at hand, or else developing still better ones.

By ‘method’, I do not mean simply whatever approach the scientist employs to test his or her own guesses about the puzzle, but just as importantly, a system that can be used to prove this solution to the satisfaction of a wider scientific community. For methods too are accepted only once they have been tried and tested.

So when the philosopher Karl Popper claims that the scientific method depends upon “testable hypotheses” (or as my friend calls them “detestable hypotheses”) I would say fair enough… but let’s not mistake this definition for a description of what scientists actually do. We may accept that science must make statements that can be falsified – this is indeed a useful “line of demarcation”, as Popper puts it 4 – and we can call these statements “testable hypotheses” if we choose – but science is simply about broadening and refining our knowledge and understanding, and any approach that is scientifically accountable will really do just fine.

*

So what of religion? Well, that’s a pricklier issue again, of course, so let me swerve clear of any direct answer for the moment so as to draw a further comparison with science.

Where a religious person may say, I have faith in such and such because it is written so, a scientist, assuming she is honest, ought only to say that “given the evidence we have thus far collected and collated, our best explanation is the following…” As more evidence becomes available, our scientist, assuming she has integrity (at least as a scientist), may humbly (or not) concur that her previously accepted best explanation is no longer satisfactory. In short, the scientist is always required by virtue of their profession to keep an open mind; the truth of their discipline being something that’s forever unfolding and producing facts that are rarely final.

For the religious-minded, however, the very opposite may apply, and for all who know that the true shape of things is already revealed to them through faith, there must be absolute restrictions to further open-minded inquiry. (Not that all religions stress the importance of such unassailable beliefs – some do not.)

Where it is the duty of every scientist to accept all genuine challenges, and to allow (as Richard Feynman once put it) for Nature to come out “the way she is”, it is the duty of many religious believers (though not, as I say, of all who are religious) to maintain a more rigidly fixed view of the world. Here again, however, it ought to be stressed that the scientist’s constant and single-minded aim for objectivity is not necessarily dependent on his or her lack of beliefs or subjective opinion – scientists are, after all, only human. So virtually all scientists come to their puzzles with preconceived hunches, and, whether determined by the head or the heart, have a preference for one solution over another. But this doesn’t much matter, so long as they are rigorous in their science.

Indeed, many of the most brilliant scientific minds have also held strongly religious convictions (Newton and Darwin spring immediately to mind). In studying that great work called Nature, Newton was implicitly trying to understand the mind of God, and finally Newton’s discoveries did not shatter his belief in God, but instead confirmed for him that there must be an intelligent agency at large, or at least one that set things initially in motion. Darwin’s faith was more fundamentally rocked (as we shall see), yet he came to study Nature as another devout believer. But the art of the scientist in every case is to recognise such prejudices and put them to one side, and this is the original reason for developing such strict and rigorous methodologies. Ultimately, to reiterate, Science is no more or less powerful than its own methods for inquiry. Which is how it was that physicists and astronomers gradually put aside their reservations as the evidence grew in favour of Father Lemaître’s theory of creation.

So the lesson here is that whereas religion demands faith, science asks always for the allowance of doubt and uncertainty. And just as St Thomas asked to see the holes in Christ’s palms, so too every responsible scientist is called to do the same, day in and day out. Doubting Thomas should be a patron saint of all scientists.

*

I wish to change the subject. It is not my aim to pitch science against religion and pretend that science is somehow the victor, when in truth I regard this as a phoney war. On its own territory, which is within the bounds of what is observable and measurable, science must always win. This is inevitable. Those who still look for answers to scientific questions in the ancient writings of holy men are only deceiving themselves.

But science too has its boundaries, and, as the philosopher Ludwig Wittgenstein argued in his famous (if notoriously difficult) Tractatus Logico-Philosophicus – proceeding via an interwoven sequence of numbered and nested propositions and aphorisms to systematically unravel the complex relationship between language, thought and the world – rational inquiry, though our most promising guide for uncovering the facts of existence, can never be complete.

Just as the Universe apparently won’t allow us to capture every last drop of heat energy and make it do work for us, at least according to current thermodynamic theories, so Wittgenstein argued (to his own satisfaction and also to the exacting standards of Bertrand Russell) that an analogous limitation applies to all systems of enquiry designed for capturing truth. Even the most elaborate engines in the world cannot be made 100% efficient, and likewise the most carefully constructed forms of philosophical investigation, even accepting Science as the most magnificent philosophical truth engine we shall ever devise (as Wittgenstein did 5), will inescapably be limited to that same extent – perfection in both cases being simply unattainable.
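To make the thermodynamic half of this analogy concrete (my gloss, not Wittgenstein’s): the second law caps the efficiency of any heat engine running between a hot reservoir at temperature T_h and a cold sink at T_c at the Carnot limit,

$$\eta_{\max} = 1 - \frac{T_c}{T_h},$$

which falls short of 100% whenever T_c is above absolute zero – and the third law forbids ever reaching absolute zero. Likewise, on Wittgenstein’s account, no system of enquiry can be engineered to capture every last truth.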

Many have racked their brains to think up the most cunning of contraptions, but none have invented a perpetual motion machine, and the same, according to Wittgenstein, goes for anyone wishing to generate a comprehensive theory of everything, which is just another human fantasy. 6 Most significantly and most controversially, Wittgenstein says that no method can be devised for securing any certain truths regarding ethics, aesthetics, or metaphysics, and that consequently all attempts at pure and detached philosophical talk of these vital matters are mere sophistry.

Having revealed the ultimate limitations to reasoning, Wittgenstein then arrives at his seventh, and perhaps most famous, proposition in this most celebrated of works. A stand-alone declaration: it is the metaphorical equivalent of slamming the book shut!

“What we cannot speak about we must pass over in silence,” 7 he says, suddenly permitting himself the licence of a poet.

This was his first and also last hurrah as a philosopher (or so he thought 8), Wittgenstein taking his lead from his own writings – and what greater measure of integrity for a philosopher than to live according to their own espoused principles. Ditching his blossoming career at Cambridge, he set out in pursuit of a simpler life back in his Austrian homeland, first (and somewhat disastrously) as a primary school teacher, and then more humbly as a gardener at a monastery. (Although at length, of course, he did famously return to Cambridge to resume and extend his “philosophical investigations”).

But isn’t this all just a re-dressing of much earlier scepticism? Well, Wittgenstein is quick to distance himself from such negative doctrines, for he was certainly not denying truth in all regards (and never would). But faced by our insurmountable limitations to knowledge, Wittgenstein is instead asking those who discuss philosophies beyond the natural sciences to intellectually pipe down. Perhaps he speaks too boldly (some would say too arrogantly). Maybe he’s just missing the point that others more talented would have grasped, then stomping off in a huff. After all, he eventually turned tail in 1929, picking up where he’d left off in Cambridge, returning in part to criticise his own stumbling first attempt. But then what in philosophy was ever perfectly watertight?

The one thing he was constantly at pains to point out: that all philosophy is an activity and not, as others had believed, the golden road to any lasting doctrinal end. 9 And it’s not that Wittgenstein was really stamping his feet and saying “impossible!”, but rather that he was attempting to draw some necessary and useful boundaries. Trying to stake out where claims to philosophic truth legitimately begin and end. An enterprise perhaps most relevant to the natural sciences, an arena of especially precise investigation, and one where Wittgenstein’s guiding principle – that whatever can be said at all can be said clearly – can be held as a fair measure against all theories. Indeed, I believe this insistence upon clarity provides a litmus test for claims of “scientific objectivity” from every field.

Embedded above is a film by Christopher Nupen entitled “The Language Of The New Music”, about Ludwig Wittgenstein and Arnold Schoenberg; two men whose lives and ideas run parallel in the development of Viennese radicalism. Both men emerged from the turmoil of the Habsburg Empire in its closing days with the idea of analysing language and purging it with critical intent, believing that in the analysis and purification of language lies the greatest hope that we have.

*

Let me return to the question of religion itself, not to inquire further into “what it is” (since religion takes many and varied forms, the nature of which we may return to later), but rather to ask more pragmatically “whether or not we are better with or without it”, in whatever form. A great many thinkers past and present have toyed with this question; a considerable few finding grounds to answer with a very resounding “without”.

In current times there has been no more outspoken advocate of banishing all religion than the biologist Richard Dawkins. Dawkins, aside from being a scientist of unquestionable ability and achievement, is also an artful and lively writer; his books on neo-Darwinian evolutionary theory are just as clear and precise as they are wonderfully detailed and inspiring. He allows Nature to shine forth with her own brilliance, though never shirking descriptions of her darker ways. I’m very happy to say that I’ve learnt a great deal from reading Dawkins’ books and am grateful to him for that.

In his most famous (although by no means his best) book, The Selfish Gene, Dawkins set out to uncover the arena wherein the evolution of life is ultimately played out. After carefully considering a variety of hypotheses, including competition between species and rivalry within groups and between individuals, he concludes that in all cases the real drama takes place at a lower, altogether more foundational level. Evolution, he explains, after a great deal of scrupulous evidential analysis, is driven by competition between fragments of DNA called genes, and these blind molecules care not one jot about anything or anyone. This is why the eponymous gene is so selfish (and Dawkins might perhaps have chosen his title a little more carefully, since those who haven’t read beyond the cover may wrongly presume that scientists have discovered a gene for selfishness, which is most certainly not the case). But I would like to save any further discussion about theories of biological evolution, and of how these have shaped our understanding of what life is (and hence what we are), for later chapters. Here instead I want to briefly consider Dawkins’ idea not of genes but “memes”.

*

In human society, Dawkins says in the final chapter of The Selfish Gene, change is effected far more rapidly by shifts in ideas than by the steadier shifts in our biology. So in order to understand our later development, he presents the notion of a parallel evolution between kinds of primal idea-fragments, which he calls “memes”. The memes that are most successful (i.e., the most widely promulgated) will, says Dawkins, like genes, possess particular qualities that increase their chances of survival and reproduction. In this case, memes that say “I am true so tell others” or, more dangerously, “destroy any opposition to my essential truth” are likely to do especially well in the overall field of competition. Indeed, says Dawkins, these sorts of memes have already spread and infected us like viruses.
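As a toy illustration of this logic (my own sketch, not Dawkins’ – the population size and transmission probabilities are invented for the example), here are two otherwise identical memes that differ only in how strongly they urge their hosts to pass them on:

```python
import random

def simulate(pop_size=1000, encounters=200_000, seed=1):
    """Two memes compete; they differ only in transmission propensity."""
    random.seed(seed)
    # Per-encounter probability that a host passes its meme on:
    # meme "A" carries an 'I am true, so tell others' clause; "B" does not.
    p_transmit = {"A": 0.30, "B": 0.05}
    # Seed each meme in a small minority; everyone else holds neither.
    beliefs = ["A"] * 10 + ["B"] * 10 + [None] * (pop_size - 20)
    for _ in range(encounters):
        speaker, listener = random.sample(range(pop_size), 2)
        meme = beliefs[speaker]
        if meme is not None and random.random() < p_transmit[meme]:
            beliefs[listener] = meme  # the listener adopts the speaker's meme
    return {meme: beliefs.count(meme) for meme in ("A", "B", None)}

print(simulate())  # meme "A" typically swamps both "B" and the undecided
```

Nothing in the simulation measures truth; differential transmission alone decides the outcome, which is all Dawkins’ point requires – and also, as the next paragraphs argue, exactly why the meme idea cannot by itself tell a virulent falsehood from a genuinely better idea.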

For Dawkins, religious beliefs are some of the best examples of these successful selfish memes, persisting not because of any inherent truth, but simply because they have become wonderfully adapted for survival and transmission. His idea (a meme itself presumably) certainly isn’t hard science – in fact it’s all rather hand-waving stuff – but as a vaguely hand-waving response I’d have to admit that he has a partial point. Ideas that encourage self-satisfied proselytising do often spread more virulently than similar ideas that do not. Yet ideas also spread because they are frankly better ideas, so how can Dawkins’ theory of memes take this more positive reason into account? Can the same idea explain, for instance, why the ideas of science and liberal humanism have also spread so far and wide? Aren’t these merely other kinds of successful meme, with no special privilege above memes that encourage sun worship and blood sacrifice?

My feeling here is that Dawkins comes at this from the wrong direction. There is no rigorous theory for the evolution of memes, nor can there be, since there is no clearly discernible, let alone universal, mechanism behind the variation and selection of ideas. But then of course Dawkins knows this perfectly well and never attempts to make a serious case. So why does he mention memes at all?

Well, as an atheistic materialist, he obviously already knows the answer he wants, and this faux-theory of memes is just his damnedest attempt to secure the right result. Religion operating as a virus is an explanation that plainly satisfies him, and since his route to that answer depends on altogether shaky methodology, he puts aside his otherwise impeccable scientific principles and, driven to prove what in truth he only feels, spins a theory backwards from a prejudice. What Dawkins and others have perhaps failed to recognise is that, in the fullest sense, questions of religion – of why we are here, of why we suffer, of what makes a good life – will never be cracked by the sledgehammer of reason, for questions of value lie outside the bounds of scientific analysis. Or if he does recognise this, then what he fails to understand is that there are many, quite different in temperament, who will always need attempted answers to these profound questions.

*

I didn’t grow up in a particularly religious environment. My mother had attended Sunday school, and there she’d learnt to trust in the idea of heaven and the eternal hereafter. It wasn’t hell-fire stuff and she was perfectly happy to keep her faith private. My father was more agnostic. He would probably now tell you that he was always an atheist but, in truth, and like many good atheists, he was actually an agnostic. The test of this is simple enough: he quite often admitted how nice it would be to have faith in something, although his own belief was just that Jesus was a good bloke and that the world would be much nicer if people tried to emulate him a bit. (Which is a Christian heresy, of course!)

I was lucky enough to attend a small primary school in a sleepy Shropshire village. Although it was a church school of sorts, religious instruction involved nothing more than the occasional edifying parable, various hymns, ancient and modern, and the Lord’s Prayer mumbled daily at the end of assembly. Not exactly what you’d call indoctrination. At secondary school, religious instruction became more formalised – one hour each week, presumably to satisfy state legislation. Then, as the years went by, our lessons in R.E. shifted from a purely Christian syllabus to one with more multicultural aspirations. So we learnt about Judaism, Islam, and even Sikhism, although thinking back I feel sure that our teacher must have delivered such alternative lessons through gritted teeth. I recall how a classmate once mistook the creature on top of a Christmas tree for a fairy. Hark, how you should have heard her!

Being rather devout, this same teacher – a young, highly-strung, and staunchly virginal spinster – also set up a Christian Union club that she ran during the lunch hour, and for some reason I joined up. Perhaps it had to do with a school-friend telling me about Pascal’s wager: that you might as well believe in God since you stand to gain so much for the price of so small a stake. In any case, for a few weeks or months I tried to believe, or at least tried to discover precisely what it was that I was supposed to be believing in, though I quickly gave up. Indeed, the whole process actually made me hostile to religion. So for a time I would actively curse the God in the sky – test him out a bit – which proves only that I was believing in something.
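Pascal’s reasoning, incidentally, can be put in modern expected-value terms (my gloss, not my school-friend’s): grant any non-zero probability p that God exists, an infinite reward for correct belief, and only some finite cost c for the wager itself. Then

$$\mathbb{E}[\text{believe}] = p\cdot\infty - (1-p)\,c = \infty, \qquad \mathbb{E}[\text{disbelieve}] \;\text{remains finite},$$

so belief dominates however small p may be – the catch being everything smuggled into that payoff table, beginning with the question of which God one ought to be betting on.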

Well to cut a long story short, whatever strain of religion I’d contracted, it was something that did affect me to a considerable extent in my late teens and early twenties. Of course, by then I regarded myself a fervent atheist, having concluded that “the big man in the sky” was nothing more or less than an ugly cultural artifact, something alien, someone else’s figment planted in my own imagination… and yet still I found that I had this God twitch. Occasionally, and especially for some reason whilst on long journeys driving the car, I’d find myself ruminating on the possibility of his all-seeing eyes watching over me. So, by and by, I decided to make a totally conscious effort to free myself from this mind-patrolling spectre, snuffing out all thought of God whenever it arose. To pay no heed to it. And little by little the thought died off. God was dead, or at least a stupid idea of God, a graven image, and one I’d contracted in spite of such mild exposure to Christian teachings. A mind-shackle that was really no different from my many other contracted neuroses. Well, as I slowly expunged this chimera, I discovered another way to think about religion, although I hesitate to use such a grubby word – but what’s the choice?

Spirituality – yuck! It smacks of a cowardly cop-out to apply such a slippery alternative. A weasel word. A euphemism almost, to divert attention from mistakes of religions past and present. But are there any more tasteful alternatives? And likewise – though God is just such an unspeakably filthy word (especially when He bears an upper case G like a crown), what synonym can serve the same purpose? You see how difficult it is to talk of such things when much of the available vocabulary offends (and for some reason we encounter similar problems talking about death, defecation, sex and a hundred other things, though principally death, defecation and sex). So allow me to pass the baton to the greatly overlooked genius of William James, who had a far greater mastery over words than myself, and is a most elegant author on matters of the metaphysical.

*

“There is a notion in the air about us that religion is probably only an anachronism, a case of ‘survival’, an atavistic relapse into a mode of thought which humanity in its more enlightened examples has outgrown; and this notion our religious anthropologists at present do little to counteract. This view is so widespread at the present day that I must consider it with some explicitness before I pass to my own conclusions. Let me call it the ‘Survival theory’, for brevity’s sake.” 10

Here is James steadying himself before addressing his conclusions regarding The Varieties of Religious Experience. The twentieth century has just turned. Marx and Freud are beginning to call the tunes: Science, more broadly, in the ascendant. But I shall return to these themes later in the book, restricting myself here to James’ very cautiously considered inquiries into the nature of religion itself and why it can never be adequately replaced by scientific objectivity alone. He begins by comparing the religious outlook to the scientific outlook and by considering the differences between each:

The pivot round which the religious life, as we have traced it, revolves, is the interest of the individual in his private personal destiny. Religion, in short, is a monumental chapter in the history of human egotism… Science on the other hand, has ended by utterly repudiating the personal point of view. She catalogues her elements and records her laws indifferent as to what purpose may be shown by them, and constructs her theories quite careless of their bearing on human anxieties and fates… 11

This is such a significant disagreement, James argues, that it is easy to sympathise with the more objective approach guaranteed by hard-edged precision of science, and to dismiss religious attitudes altogether:

You see how natural it is, from this point of view, to treat religion as mere survival, for religion does in fact perpetuate the traditions of the most primeval thought. To coerce the spiritual powers, or to square them and get them on our side, was, during enormous tracts of time, the one great object in our dealings with the natural world. For our ancestors, dreams, hallucinations, revelations, and cock-and-bull stories were inextricably mixed with facts… How indeed could it be otherwise? The extraordinary value, for explanation and prevision, of those mathematical and mechanical modes of conception which science uses, was a result that could not possibly have been expected in advance. Weight, movement, velocity, direction, position, what thin, pallid, uninteresting ideas! How could the richer animistic aspects of Nature, the peculiarities and oddities that make phenomena picturesquely striking or expressive, fail to have been singled out and followed by philosophy as the more promising avenue to the knowledge of Nature’s life. 12

As true heirs to the scientific enlightenment, we are asked to abandon such primeval imaginings and, by a process of deanthropomorphization (to use James’ own deliberately cumbersome term), which focuses only on the precisely defined properties of the phenomenal world so carefully delineated by science, sever the private from the cosmic. James argues, however, that such enlightenment comes at a cost:

So long as we deal with the cosmic and the general, we deal only with the symbols of reality, but as soon as we deal with private and personal phenomena as such, we deal with realities in the completest sense of the term. 13

Thus, to regard one’s life entirely through the pure and impersonal lens of scientific inquiry is to see through a glass, not so much too darkly as too impartially. Being expected to leave out from our descriptions of the world “all the various feelings of the individual pinch of destiny, [and] all the various spiritual attitudes”, James compares to being offered “a printed bill of fare as the equivalent for a solid meal.” He expresses the point most succinctly, saying:

It is impossible, in the present temper of the scientific imagination, to find in the driftings of cosmic atoms, whether they work on the universal or on the particular scale, anything but aimless weather, doing and undoing, achieving no proper history, and leaving no result.

This is the heart of the matter, and the reason James surmises, quite correctly in my opinion:

… That religion, occupying herself with personal destinies and keeping thus in contact with the only absolute realities which we know, must necessarily play an eternal part in human history. 14

*

Mauro Bergonzi, Professor of Religion and Philosophy in Naples, speaks about the utter simplicity of what is:

*

“I gotta tell you the truth folks,” comedian George Carlin says at the start of his most famous and entertaining rant, “I gotta tell you the truth. When it comes to bullshit – big-time, major league bullshit! You have to stand in awe of the all-time champion of false promises and exaggerated claims: Religion! Think about it! Religion has actually convinced people that there’s an invisible man! – living in the sky! –  who watches everything you do, every minute of every day…”

And he’s right. It’s bonkers but it’s true, and Carlin is simply reporting what many millions of people very piously believe. Sure, plenty of Christians, Muslims and Jews hold a more nuanced faith in their one God, and yet for vast multitudes of believers, this same God is nothing but a bigger, more powerful humanoid. A father figure.

“Man created God in his own image,” is the way a friend once put it to me. And as a big man, this kind of a God inevitably has a big man’s needs.

Of course, the gods of most, if not all, traditions have been in the business of demanding offerings of one kind or another to be sacrificed before them, for what else are gods supposed to receive by way of remuneration for their services? It’s hardly surprising then that all three of the great Abrahamic faiths turn sacrifice into a central theme. But then what sacrifice can ever be enough for the one-and-only God who already has everything? Well, as George Carlin points out, God is generally on the lookout for cash:

“He’s all-powerful, all-perfect, all-knowing and all-wise, but somehow just can’t handle money!” But still, cash only goes so far. Greater sacrifices are also required, and, as the Old Testament story of Abraham and Isaac makes abundantly clear, on some occasions nothing less than human blood-sacrifice will do. 15 The implicit lesson of this story being that the love of our Lord God requires absolute obedience, nothing less. For ours is not to reason why…

“Oh, God you are so big!” the Monty Python prayer begins – bigness being reason enough to be awed into submission. But God also wants our devotion, and then more than this, he wants our love to be unconditional and undiluted. In short, he wants our immortal souls, even if for the meantime, he’ll settle for other lesser sacrifices in lieu.

As for the more caring Christian God (the OT God restyled), well here the idea of sacrifice is up-turned: the agonising death of his own son on Golgotha apparently satisfying enough to spare the rest of us. It’s an interesting twist, even if the idea of a sacrificed king is far from novel; dividing his former wholeness and then sacrificing one part of himself to secure the eternal favour of his other half is a neat trick.

But still, why the requirement for such a bloody sacrifice at all? Well, is it not inevitable that every almighty Lord of Creation must sooner or later get mixed up with the God of Death? For what in nature is more unassailable than Death; the most fearsome destroyer who ultimately smites all. Somehow this God Almighty must have control over everything and that obviously includes Death.

“The ‘omnipotent’ and ‘omniscient’ God of theology,” James once wrote in a letter, “I regard as a disease of the philosophy shop.” And here again I wholeheartedly agree with James. Why…? For all the reasons given above, and, perhaps more importantly, because any “one and only” infinitist belief cannot stand the test at all. Allow me to elucidate.

The world is full of evils; some of these are the evils of mankind, but certainly not all. So what sort of a God created amoebic dysentery, bowel cancer and the Ebola virus? And what God would allow the agonies of his floods, famines, earthquakes, fires and all his other wondrously conceived natural disasters? What God would invent the parasitic wasps that sting their caterpillar hosts to leave them paralysed, laying their eggs inside so that their grubs will eat the living flesh?

The trouble is that any One True Lord, presuming this Lord is also of infinite goodness, needs, by necessity, a Devil to do his earthly bidding. This is unavoidable because without an evil counterpart such an infinite and omnipotent God, by virtue of holding absolute power over all creation, must thereby permit every evil in this world, whether man-made or entirely natural in origin. And though we may of course accept that human cruelties are a necessary part of the bargain for God’s gift of free will – which is a questionable point in itself – we are still left to account for such evils as exist beyond the limited control of our species.

Thus, to escape the problem of blaming such “acts of God” on God himself, we may choose to blame the Devil instead for all our woes, yet this leads inexorably to an insoluble dilemma. For if the Devil is a wholly distinct and self-sustaining force we have simply divided God into two opposing halves (when He must be One), whereas if we accept that this Devil is just another of the many works of the One God, then the problem never really went away in the first place. For why would any omnipotent God first create and then permit the Devil to go about in his own evil ways? It is perhaps Epicurus who puts this whole matter most succinctly:

Is God willing to prevent evil, but not able? Then he is not omnipotent. Is he able, but not willing? Then he is malevolent. Is he both able and willing? Then whence cometh evil? Is he neither able nor willing? Then why call him God? 16
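Later philosophers tightened Epicurus’ challenge into an inconsistent triad; the propositional shorthand below is my own gloss. Let O stand for ‘God is omnipotent’, B for ‘God is wholly benevolent’, and E for ‘evil exists’, and grant the auxiliary premises that a benevolent being prevents whatever evil it can, and that an omnipotent being can prevent all evil. Then

$$O \wedge B \;\Rightarrow\; \neg E,$$

and yet E is plainly observed, so at least one of O, B or the auxiliary premises has to be surrendered – which is essentially the move James makes below in cutting loose from ‘the monistic assumption’.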

It is here that we enter the thorny theological “problem of evil”, although it might equally fittingly be called the “problem of pain”, for without pain, in all its various colourations, it is hard to imagine what actual form the evil itself could take.

So confronted by the Almighty One, we might very respectfully ask, “why pain?” Or if not why pain as such – for conceivably this God may retort that without pain we would not appreciate joy, just as we could not measure the glory of day without the darkness of night – we might still ask: why such excessive pain, and why so arbitrarily inflicted? For what level of ecstasy can ever justify all of Nature’s cruelties?

At this point, James unceremoniously severs the Gordian knot as follows: “… the only obvious escape from paradox here is to cut loose from the monistic assumption altogether, and allow the world to have existed from its origin in pluralistic form, as an aggregate or collection of higher and lower things and principles, rather than an absolutely unitary fact. For then evil would not need to be essential; it might be, and it may always have been, an independent portion that had no rational or absolute right to live with the rest, and which we might conceivably hope to see got rid of at last…”

*

There are many who have set out to find proof of God’s existence. Some have looked for evidence in archaeology – the sunken cities of Sodom and Gomorrah, the preserved remains of Noah’s Ark, and most famously, the carbon dating of the Shroud of Turin – but again and again the trails run cold. Others turned inwards, searching for proof of God through reason. But this is surely the oldest mistake in the book. For whatever God could ever be proved by reason would undoubtedly shrivel up into a pointless kind of a God.

But there is also a comparable mistake to be made. It is repeated by all who, after so many attempts have failed, still try to absolutely refute God’s existence. For God, even the Judeo-Christian-Islamic God, can in some more elusive sense remain subtle enough to slip all the nets. He need not maintain the form of the big man in the sky, but can diffuse into an altogether more mysterious form of cosmic consciousness. In this more mystical form, with its emphasis on immediate apprehension, history also sinks into the background.

Dawkins and others who adhere to a strictly anti-religious view of the world are in the habit of disregarding these more subtle and tolerant religious attitudes. Fashioning arguments that whip up indignation in their largely irreligious audience, they focus on the rigid doctrines of fundamentalists. And obviously, they will never shake the pig-headed faith of such fundamentalists, but then neither will their appeals to scientific rationalism deflect many from holding more flexible and considered religious viewpoints. The reason for this is simple enough: that man (or, at least, most people) cannot live by bread alone. So, for the genuinely agnostic inquirer, strict atheism provides only an unsatisfactory existential escape hatch.

In the year 2000, the world-renowned theoretical physicist and mathematician Freeman Dyson won the Templeton Prize for Progress in Religion 17. In his acceptance speech he staked out the rightful position of religion as follows:

I am content to be one of the multitude of Christians who do not care much about the doctrine of the Trinity or the historical truth of the gospels. Both as a scientist and as a religious person, I am accustomed to living with uncertainty. Science is exciting because it is full of unsolved mysteries, and religion is exciting for the same reason. The greatest unsolved mysteries are the mysteries of our existence as conscious beings in a small corner of a vast universe. Why are we here? Does the universe have a purpose? Whence comes our knowledge of good and evil? These mysteries, and a hundred others like them, are beyond the reach of science. They lie on the other side of the border, within the jurisdiction of religion.

So the origins of science and religion are the same, he says, adding a little later:

Science and religion are two windows that people look through, trying to understand the big universe outside, trying to understand why we are here. The two windows give different views, but they look out at the same universe. Both views are one-sided; neither is complete. Both leave out essential features of the real world. And both are worthy of respect.

Trouble arises when either science or religion claims universal jurisdiction, when either religious dogma or scientific dogma claims to be infallible. Religious creationists and scientific materialists are equally dogmatic and insensitive. By their arrogance they bring both science and religion into disrepute. 18

By restoring mystery to its proper place at the centre of our lives, Dyson’s uncertainty might indeed offer the possibility for actual religious progress. It might achieve something that the purer atheism almost certainly never will. Hallelujah and amen!

*

Once upon a time I was an atheist too, only slowly coming to realise that being so sure-footed about the inessential non-spirituality of existence requires an element of faith of its own. It requires a faith in the ultimate non-mystery of the material universe. That everything is, in principle at least, fathomable. Not that this means our atheistic scientific worldview must inevitably be duller, nor that it automatically considers life less wonderful. Not at all. Life and the rest of it may appear to be just as aimless as weather, to steal James’ choice metaphor, but this has a kind of beauty of its own, as many an atheist will affirm. And there’s security of a different, some would say higher, form in the acceptance and affirmation of perfectly aimless existence. It can feel like a weight lifted.

Yet, the rarely admitted truth is that the carriers of the scientific light of reason (of whom I remain very much one) are just as uncertain as the average Joe Churchgoer about what might loosely be termed the supernatural (or supranatural) – by which I mean both the ultimately unknowable, and also, whatever strange and various events still remain unexplained by our accepted laws of the natural world. All of which stands to reason: the inexplicable lying, by its very definition, outside the province of science, whilst, at the same time, a bristling realisation that the universe is inherently and intractably mysterious stirs unconsciously at the back of all our minds, even those of the most logical and rational of thinkers. For the stark truth is that existence itself is spooky! And consequently, scientists too are sometimes afraid of the dark.

Finally then, the practising scientist, putting aside all questions of ultimate meaning or purpose, for these concerns are beyond the scope of their professional inquiries, must admit that they sideline such matters only on the grounds of expedience. The only useful scientific questions are ones that can be meticulously framed. So whilst science is necessarily dispassionate and preoccupied with material facts, it does not follow that being scientific means mistaking the world revealed by science for the scientific model that approximates it – any model of the universe being, at best, a pale approximation to the true complexity of the original.

Scientists then are not the new high priests and priestesses of our times, because their role is cast quite differently. Gazing downwards rather than upwards, to earth rather than heaven, they pick away at the apparently lesser details in the hope of unravelling the bigger picture. Turning outwards instead of inwards, deliberately avoiding subjective interpretations in favour of tests and measurements, they seek to avoid opinion and to rise above prejudice. All of this requires a kind of modesty, or should.

But there is also a fake religion, one that dresses itself in the brilliant white of laboratory coats. It pleads that the only true way to understanding is a scientific one, disavowing all alternatives to its own rational authority. Of course such claims to absolute authority are no less fraudulent than claims of papal infallibility or the divine right of kings, but true devotees of the new religion are blind to such comparisons. More importantly, they fail to see that all claims to an exclusive understanding, whether resting on the doctrines of religion or on the microscopic scrutiny of science, aside from being false claims, necessarily involve a diminution of life itself. That at its most extreme, this new religion of scientific materialism leads unswervingly to what William Blake called “the sleep of Newton”: a mindfulness only to what can be measured and calculated. And truly this requires a tremendous sacrifice.

*

James Tunney, LLM, is an Irish Barrister who has lectured on legal matters throughout the world. He is also a poet, visual artist, and author of “The Mystical Accord: Sutras to Suit Our Times, Lines for Spiritual Evolution”. In addition, he has written two dystopian novels – “Blue Lies September” and “Ireland I Don’t Recognize Who She Is”. Here he speaks with the host of “New Thinking Allowed”, Jeffrey Mishlove, about the ‘Perennial Philosophy’ tradition found in cultures throughout the world, for which the essential core tenet is mysticism. What is meant by mysticism is discussed at length, and as Tunney explains, one important characteristic shared by all mystical traditions is the primary recognition of humans (and animals) as spiritual beings. Thus, scientism as a cultural force, by virtue of its absolutist materialist dogma, is necessarily antagonistic to all forms of mysticism:

*

So by degrees I’ve been converted back to agnosticism, for all its shamefulness. Agnosticism meaning “without knowledge”. I really have no idea whether or not a god of any useful description exists, nor even whether this is a reasonable question, yet I can still confidently rule out many of his supposed manifestations (especially those where his name is top-heavy with its illuminated capital G). But any detailed speculation on the nature of god or, if you prefer, the spiritual, is what William James calls “passing to the limit”, and in passing that limit we come to what he calls the “over-beliefs”.

Over-beliefs are the prime religious currency in which churches do the bulk of their business. They are what most distinguish the Lutherans from the Catholics; the Sunnis from the Shias; and more schismatically again, the Christians from the Muslims. All the carefully formulated dogma about the Holy Trinity, the Immaculate Conception, the virgin birth; the sacraments and the catechisms; and the ways of invocation of the One True God; or in more Easterly traditions, the karmic cycle and the various means and modes of reincarnation, and so on and so forth, all are over-beliefs, for they attempt to cross the threshold from “the sensible and merely understandable world” to “the hither side”. In his own conclusions, James suggested a more “pluralistic hypothesis” to square the varieties of religious experience:

Meanwhile the practical needs and experiences of religion seem to me sufficiently met by the belief that beyond each man and in a fashion continuous with him there exists a larger power which is friendly to him and to his ideals. All that the facts require is that the power should be other and larger than our conscious selves. Anything larger will do, if only it be large enough to trust for the next step. It need not be infinite, it need not be solitary. It might conceivably even be only a larger and more godlike self, of which the present self would then be but the mutilated expression, and the universe might conceivably be a collection of such selves, of different degrees of inclusiveness, with no absolute unity realized in it at all…

These are James’ over-beliefs and they broadly concur with my own. Though mine have also been tinted a little by Eastern hues. Intuitively I am drawn by the Taoist notion of the constant flux of eternal becoming. An unnameable current of creation with an effortless strength like the strength of water, which is subtle, flexible and unstoppable. Accordingly, my intuition respects the Taoist directive to flow effortlessly with this eternal current, for there is no sense in swimming against it. And this is a philosophy that complements well the mindfulness of Zen (or Ch’an), with its playful seriousness, its snapping fingers calling the wandering attention back to the here and now. I can easily empathise with the Zen student’s search for the raw nakedness of naked existence, with its requirement to strip all veils of presumed understanding; focusing upon where the outer and inner worlds reflect, to achieve a spontaneous but ineffable awakening. I can see it as a potentiality, and it does not jar against the hard-won rationality of my scientific training. In contrast to so much of the declarative wiseacring of Western philosophy, mastery of both disciplines is all about knowing when to shut up. As mythologist Joseph Campbell, author of The Hero with a Thousand Faces, once said:

God is a thought, God is an idea, but its reference is to something that transcends all thinking. I mean, he’s beyond being, beyond the category of being or nonbeing. Is he or is he not? Neither is nor is not. Every god, every mythology, every religion, is true in this sense: it is true as metaphorical of the human and cosmic mystery. He who thinks he knows doesn’t know. He who knows that he doesn’t know, knows. 19

I am not, of course, a Taoist nor a Buddhist of any kind. I am unaffiliated to any church. But I am drawn to Taoism and Zen Buddhism because of their appeals to objectivity, with emphasis on revelation above and beyond belief. For in neither Taoism nor Zen is any shape of God decreed or delineated: God being as much a zero as a one. And as a one-nothing, or a no-thing, this no-God requires no sacrifice, no high calls to blind obedience; for the Universe is as the Universe does. Yet something of the religious remains, beyond the purely philosophical, a something that strict atheism lacks: a personal role within the cosmic drama, which escapes the absurd chance and purposeless drifting of materialist scientism. 20

So it is that I choose to adopt them to an extent. To draw on their philosophies, and to marry these with ideas found in strands of Western Existentialism, with aspects of liberal humanism and with the better parts of Christianity (distilled in the songs of Blake, for instance). But whilst it may be edifying to pick the best from traditions of both East and West, to satisfy my god-shaped hole, I see too that such a pick-and-mix approach is prone to make as many false turns as any traditional religious route – it is interesting to note here that the word “heretic” derives from the Greek hairetikos, meaning “able to choose”. For there are no actual boundaries here. So what of the many shamanic traditions and tribal gods of primitive society? What about our own pagan heritage? Isn’t it time to get out the crystals and stuff some candles in my ears? Mesmerised by a hotchpotch of half-comprehended ideas and beliefs, just where are the safeguards preventing any freewheeling religious adventurer from falling into a woolly-headed New Ageism?

Well, it’s not for me or anyone else to call the tune. Live and let live – everyone should be entitled to march to the beat of their own drums, always taking care not to trample the toes of others in the process. But this idea of the New Age is a funny business, and I wish to save my thoughts on that (perhaps for another book). Meanwhile, my sole defence against charges of constructing a pick-and-mix religion is this: if you’d lost your keys where would you look for them? In your pocket? Down at your feet? Only under the streetlights? Oh, you have your keys – well then, good for you! Now, please don’t expect everyone else to stop looking around for theirs, or to restrict their search to the most immediate and convenient lamppost.

Having said all this, and rather shamefully spoken too much on matters that better deserve silence, it now behoves me to add that I am certainly careful when it comes to choosing between personal over-beliefs, adhering to one rule: that what is discredited by steadfast and rigorous scientific trial is guaranteed baloney. Miracles, of course, are quite out of the question, failing on account of their own self-defining impossibility. Equally I have no time for animalistic gods of any persuasion, whether or not they share a human face. But my deepest distrust is not of religions per se (since, to repeat, these are many and varied in form, and then good and bad in parts), but more specifically, for the seemingly numberless religious organs we call creeds, sects, churches and so on.

To contend that religion is always about power is to miss the bigger picture, as I hope I’ve satisfactorily shown, and yet… It would be wise for the sheep to beware the shepherd. This much agreed, however, I feel sure that religion, in some wiser form, still has an important role to play in many of our individual lives and for the sake of all our futures. You may be surprised to learn that George Orwell thought similarly, and made his opinion felt in his essay Notes on the Way (an essay which, at intervals, I shall return to later):

… Marx’s famous saying that ‘religion is the opium of the people’ is habitually wrenched out of its context and given a meaning subtly but appreciably different from the one he gave it. Marx did not say, at any rate in that place, that religion is merely a dope handed out from above; he said that it is something the people create for themselves to supply a need that he recognized to be a real one. ‘Religion is the sigh of the soul in a soulless world. Religion is the opium of the people.’ What is he saying except that man does not live by bread alone, that hatred is not enough, that a world worth living in cannot be founded on ‘realism’ and machine-guns? If he had foreseen how great his intellectual influence would be, perhaps he would have said it more often and more loudly. 21

Next chapter…

*

Addendum: mind over matter

Physicists speak about a ‘quantum theory’, but when asked what the physical reality this ‘theory’ describes is truly like, they have no useful or consistent answers at all. It works, they say, and at a mathematical level is the most precise ‘theory’ so far devised, so “shut up and calculate!” Or, if you prefer (with apologies to Shelley): look upon our quantum works and do not despair… certainly not about any gaps in our understanding of the true nature of reality that may or may not underlie it. This non-philosophical culture was the norm by the time I went to university; an opinion that was seldom if ever challenged and thus easily instilled.

Of course, quantum reality does come as a shock at first. I had genuinely felt an acute anxiety on first hearing of Schrödinger’s poor cat forever half-dead in her box. Not that we ever learnt about the famous thought experiment in class of course: no, physics abandoned Schrödinger’s cat to her interminable state of limbo long ago. Any underlying ontology was reading for pleasure only; a late-night topic for post-pub discussions.

But physics is mistaken in its beliefs. It has mixed up its modern ignorance with ultimate incomprehensibility. Schrödinger’s cat was actually meant to shock us all: most importantly, to wake up all those physicists who chose to interpret the abstraction as the world itself and decide without proof that nothing of reality exists beyond it. But we have incorporated the semi-corporeal cat into the mix of quantum oddities: as evidence of our unreal reality when the whole point was that such quantum half-death is absurd.

Moreover, what physicists today describe as ‘quantum theory’ is not strictly a theory at all, but a powerful predictive recipe and an engineering tool. A genuine theory is yet to be written, and the true quest for one is disguised by language again, because this potential future theory is what physicists currently sideline under the label ‘interpretations’ – as if they don’t much matter.

Professor of Philosophy at NYU, Tim Maudlin, explains the problem with quantum theory today and how the foundations of quantum mechanics should be understood (please ignore the perturbing observable in the background!):

Although the notion that consciousness plays a key role in quantum mechanics was seriously considered by many of the scientific luminaries of the early twentieth century, including John von Neumann, who discussed its salient role in his treatise The Mathematical Foundations of Quantum Mechanics, such interpretations have since fallen mostly out of favour (certainly amongst physicists). More recent empirical findings are, however, just beginning to challenge this scientific orthodoxy and may indeed rock the assertion that there is an inherent distinction between what I above called “quantum choice” and our conscious choice. In fact, in contradiction to what I originally wrote, some of the latest studies are producing results that show an astonishingly high correlation between conscious intention and the so-called “collapse” of the wave function.

The last word (of this chapter – not the subject!) I shall leave to Freeman Dyson:

I cannot help but think that the awareness of our brains has something to do with the process that we call “observation” in atomic physics. That is to say, I think our consciousness is not just a passive epiphenomenon carried along by the chemical events in our brains, but is an active agent forcing the molecular complexes to make choices between one quantum state and another. In other words, mind is already inherent in every electron, and the processes of human consciousness differ only in degree but not in kind from the processes of choice between quantum states which we call ‘chance’ when they are made by electrons.

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text all newly incorporated text has been italicised.

*

1    Not quite true actually. Apparently my father was one of a small number who decided not to bother watching the first men step onto the moon’s surface. He tells me that he was so sure they would make it, he couldn’t see the point. My mother watched, and apparently so did I, although still not two years old. I can’t say that I remember anything about the moment, and probably found it a lot less interesting than Bill and Ben The Flowerpot Men, but perhaps it affected me on some deeper level — could it be that seeing the first moon landing at such a tender age was part of the reason I ended up studying comets?

2    Radiation pressure is the consequence of light itself (photons) having momentum.

3    A process that releases energy to the surroundings in the form of work as opposed to endergonic, which means energy consuming. These terms are closely related to exothermic and endothermic, where energy release and absorption take the form of heat transfer.

4    Karl Popper’s precise “line of demarcation” was that, if any theory can be shown to be falsifiable, then it can usefully be described as scientific.

5

“The totality of true propositions is the whole of natural science (or the whole corpus of the natural sciences).”

— Wittgenstein, Tractatus Logico-Philosophicus, 4.11

6

“The whole modern conception of the world is founded on the illusion that the so-called laws of nature are the explanations of natural phenomena. Thus people today stop at the laws of nature, treating them as something inviolable, just as God and Fate were treated in past ages. And in fact both were right and both wrong; though the view of the ancients is clearer insofar as they have an acknowledged terminus, while the modern system tries to make it look as if everything were explained.” — Wittgenstein, Tractatus Logico-Philosophicus, 6.371-2.

7    In German: “Wovon man nicht sprechen kann, darüber muß man schweigen.”

8    “the truth of the thoughts that are here communicated seems to me unassailable and definitive.” Taken from the preface to the Tractatus Logico-Philosophicus.

9    In this first treatise of Wittgenstein (which was the only one he ever published – his later philosophy contained in “The Philosophical Investigations” being published posthumously), he begins with the totally unsupported and deeply contentious assertion that, in effect, all meaningful language involves a description, or more correctly a depiction, of fact. This follows because the use of all language involves a correlation between objects in the world and names for those objects. This is his so-called “picture theory of language” which requires, Wittgenstein claims, a one-to-one correspondence between names and objects. This given, he demonstrates that if any proposition is to be genuine it must have a definite sense, or to put it differently, for a statement to admit to any test of proof then it must at least be possible for that question to be set out absolutely clearly. For Wittgenstein this means that questions about ethics, aesthetics and theology fall outside the realm of philosophy; the reason being that they rely on words such as “goodness”, “beauty”, “truth” and “god” which have no clear one-to-one correspondence. Wittgenstein of course later changed his mind on some of this. Recognising that his picture theory was overly simplistic he returned to philosophy with a radically new idea. That the meaning of language is contained in its social usage, thereby reassigning the work of philosophers to the study of language within its natural social environment. The purpose of philosophy was now to untie the knots of these so-called “language games”. But it is easy to mistake him here – and many do – his notion being that science can properly be understood and appraised only by those who know its language, religion likewise, and so on. And not that all inquiry is merely a matter of “playing with words”.

10  The Varieties of Religious Experience: a Study in Human Nature by William James, Longmans, Green & co, 1902; from a lecture series.

11  Ibid.

12  Ibid.

13  Ibid. Italics maintained from the original source.

14  Ibid. James earlier says, “It is absurd for science to say that the egotistic elements of experience should be suppressed. The axis of reality runs solely through the egotistic places, – they are strung upon it like so many beads.”

15  Genesis Ch.22 tells how God commanded Abraham to go to the land of Moriah and to there offer up his own son Isaac as a sacrifice. The patriarch travels three days until finally he comes to the mountain, just as God had instructed, and there he tells his servant to remain until he and Isaac have ascended the mountain. Isaac, who is given the task of carrying the wood on which he will soon be sacrificed, repeatedly asks his father why there is no animal for the burnt offering. On each occasion, Abraham says that God will provide one. Finally, as Abraham draws his knife and prepares to slaughter his son, an angel stops him. Happily, a ram has been provided and it can now be sacrificed in place of Isaac.

16  This is sometimes called “the riddle of Epicurus” or “the Epicurean Paradox” even though Epicurus did not in fact leave behind any written record of this statement. The first record of it appears some four hundred or more years later, in a work by the early Christian writer Lactantius, who is actually criticising the argument.

17  Freeman Dyson is undoubtedly one of the greatest scientists never to win the Nobel Prize. However, he was awarded the Lorentz Medal in 1966 and the Max Planck Medal in 1969. In March 2000 he was also awarded the Templeton Prize, created in 1972 by the investor Sir John Templeton in an attempt to remedy what he saw as an oversight by the Nobel Prizes, which do not honour the discipline of religion. Previous Templeton Prize recipients have included the Rev. Dr. Billy Graham, Aleksandr Solzhenitsyn, Charles Colson, Ian Barbour, Paul Davies, physicist Carl Friedrich von Weizsäcker, and Mother Teresa.

18  Extracts from Freeman Dyson’s acceptance speech for the award of the Templeton Prize, delivered on May 16, 2000 at the Washington National Cathedral.

19  From an interview conducted in 1987 by American journalist Bill Moyers as a six-part series of conversations with Joseph Campbell entitled Joseph Campbell and the Power of Myth. The quote is taken from Episode 2, ‘The Message of the Myth’, broadcast on June 26, 1988. The full transcript is available here: https://billmoyers.com/content/ep-2-joseph-campbell-and-the-power-of-myth-the-message-of-the-myth/

20  It is even tempting to envisage some grand union of these two ancient Chinese philosophies, called Zow!-ism perhaps.

21  Extract taken from Notes on the Way by George Orwell, first published in Time and Tide. London, 1940.


the stuff of dreams

The following article is Chapter Two of a book entitled Finishing The Rat Race.

All previously uploaded chapters are available (in sequence) by following the link above or from category link in the main menu, where you will also find a table of contents and a preface on why I started writing it.

*

Oats and beans and barley grow,

Oats and beans and barley grow,

Do you or I or anyone know,

How oats and beans and barley grow?

— Traditional children’s rhyme

*

One of my earliest memories at school was being told that rabbits became quicker to escape foxes, and likewise, foxes became quicker to catch rabbits. This, the teacher said, is how one type of animal can slowly change into a new type through a process known as evolution. Well, I didn’t believe that for a minute. Such dramatic outcomes from such unremarkable causes. And why, I wondered, would something change simply because it had to – having to isn’t any reason.

Of course in many ways my teacher had missed the point (though in fairness, perhaps it was I who missed his point, off in a daydream, or curiously intent on the inconstant fluttering of a leaf against the window, or otherwise lost to the innocent pleasures of childhood reveries). Either way it doesn’t matter much. Importantly, my teacher had done his job – and done it well! He had planted a seed, which made this a most valuable lesson. But in his necessarily simplified account of evolution there was a flaw (and his version would by virtue of necessity have been a simple one, because however much I may have been distracted, the subtleties of evolution were beyond the grasp of our young minds). For what he had missed out is not why the rabbits became faster but how. The question being what “adaptive mechanism” could have driven any useful sequence of changes we might call ‘evolution’. And this is really the key point. Leaving out mention of any kind of adaptive mechanism, he was leaving open all sorts of possibilities. For instance, Lamarckism and Darwinism, though both theories of evolution, paint very different accounts of how life has developed, for they presume quite different adaptive mechanisms. I will try to explain the matter more carefully and in terms of giraffes.

*

You might ask a great many questions about giraffes. For instance, how on earth their extraordinary and striking markings could ever provide useful camouflage – though if you’re ever lucky enough to see one step almost invisibly out of dappled foliage into full light, you will be certain that the effect is near perfect. Alternatively, you might ask why it is that they walk with both legs on the same side moving together. A very elegant form of locomotion. However, far and away the most frequently asked question about giraffes is this: why do they have such long necks?

Well, here’s what Lamarck would have said. Giraffes began as ordinary antelope. Some of the antelope preferred grass and others preferred leaves. The ones that preferred leaves had an advantage if they could reach higher. To achieve this they would stretch their necks a little longer. As a direct result of acquiring this new characteristic, the foals of those slightly longer-necked antelope would also be born with slightly longer necks. They too would stretch that little bit higher. Over generations some types of the antelope would develop extremely long necks and the descendants of these eventually developed into a new species called giraffes.

The basis for Lamarck’s reasoning lies in a perfectly rational misunderstanding about genetics. He assumes that the “acquired characteristics” (i.e., those characteristics developed or acquired during life) of the parents will somehow be passed through to their offspring. It turns out however that this isn’t actually the case. He might have guessed as much I suppose. One of the oft-cited criticisms against Lamarck’s theory has been the case of Jewish boys. Why, his opponents would ask, do they ever grow foreskins in the first place?

Darwin offered an alternative hypothesis. Perhaps it goes like this, he thought: there are already differences within the population of antelope; some will have shorter necks than others to start with. Or in other words, there is already a “natural variation”. In times of plenty this may not be of significance, but in times of scarcity it could be that the antelope with longer necks have a slight advantage. This idea of course applies to any antelopes with other accidentally favourable characteristics, for example those that run faster, are better camouflaged, or have more efficient digestive systems; but let’s not go there – let’s stick to necks for a moment. The longer necked adults can reach higher and so get to those few extra leaves that will help them to survive. Having a slightly higher chance of survival means (all other factors being equal) that they are more likely to pass on their characteristics. Within a few generations there will be an inevitable increase in the population of the long-necked variety until eventually, the long-necked population might plausibly have evolved into a separate species.
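To make the logic concrete, here is a minimal toy simulation in Python (every parameter is an arbitrary assumption of mine – a sketch of the principle rather than a model of real antelope): individuals vary, the variation is heritable, and a slight survival advantage compounds over the generations.

```python
import random

# Toy illustration of selection acting on heritable variation.
# Every number here is an arbitrary assumption for the sketch.
random.seed(1)

# Start with 500 antelope whose neck lengths vary around 1.0 (arbitrary units).
population = [random.gauss(1.0, 0.1) for _ in range(500)]

for generation in range(50):
    # In a lean year, a longer neck gives a slightly better chance
    # of surviving to breed.
    survivors = [neck for neck in population
                 if random.random() < min(1.0, 0.5 * neck)]
    # Offspring resemble their parents, give or take a little random
    # variation -- no inheritance of acquired characteristics required.
    population = [random.gauss(parent, 0.02)
                  for parent in survivors
                  for _ in range(2)][:500]

print(f"Mean neck length after 50 generations: "
      f"{sum(population) / len(population):.2f}")
```

Run it and the mean neck length creeps upward generation by generation, though no individual ever stretches anything.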

What had Darwin achieved in this alternative explanation? Well, he had abolished any requirement for a theory of heredity that depended on the transmission of “acquired characteristics.” He’d not entirely proved Lamarck wrong, but only shown that his ideas weren’t necessary. And although in actual fact Darwin never acknowledged Lamarck’s contribution, purely in terms of theories of heredity his own version was little better than Lamarck’s (basically, by introducing the equally flawed concept of pangenes he had finally got around the issue of Jewish foreskins). But it is not what Darwin had undermined, so much as what he had set up, that preserves his legacy. That the true driving force of evolution depends on variation and competition, in a dynamic relationship that he called “natural selection”.

According to Darwin’s new vision then, the evolution of species depends upon how individuals within that species interact with their environment. Those that are best adapted will survive longer and pass on their winning characteristics, and the rest will perish without reproducing. In short, it is “the survival of the fittest” that ensures evolutionary progress; though this catchy summary was not Darwin’s own, but one that he slowly adopted. (It was actually first coined by the philosopher Herbert Spencer, whose ideas I wish to return to later.)

*

Darwin still attracts a lot of criticism, and much of it comes from religious quarters intent on promulgating the view that “it was God what done it all” – the Creationists who refuse to acknowledge any of the overwhelming evidence, whether from zoology, botany, geology, palaeontology, or embryology; rejecting reason in deference to “the word of God”. However, there are also more considered critiques.

Perhaps the most interesting of these is that Darwin’s evolutionary theory of natural selection is unscientific because it is founded on a tautology. It is after all self-evident that the fittest will survive, given that by fitness one really means “fitness for survival”. After all, it has to be admitted that sloths have survived, and in what sense can a sloth be said to “be fit” other than in its undoubted fitness to be a sloth. The assumption then is that Darwin’s idea of natural selection has added nothing that wasn’t already glaringly obvious. Yet this is an unfair dismissal.

Firstly, it is unfair because, as I have said above, “the survival of the fittest” is Spencer’s contribution – one that leads rapidly into dangerous waters – but it is also unfair because it misses the way in which Darwin’s hypothesis is not only predictive, but also (as Karl Popper was so keenly aware) testable. If Darwin’s theory were a mere tautology then nothing on earth could ever disprove his claims, and yet there is room here for evidence that might truly test his theory to destruction.

How? Well, Darwin, it must be understood, had put forward a theory of gradual adaptation, so there is no accounting for any sudden leaps within his slowly branching history of life – so if, for instance, a complex new order of species suddenly arose in the fossil record without ancestry, then Darwin’s theory would need a radical rethink. Or let’s say some fossil was found with characteristics uncommon to any discovered ancestor. Here again Darwin’s theory would be seriously challenged. On the other hand, embryologists might discover discrepancies in the way eggs develop, and likewise, following the discovery of DNA and the advent of modern genetics, we might find sudden abrupt shifts in the patterns of genes between species instead of gradual changes. Each of these cases would provide powerful evidence to challenge Darwinian theory.

But, instead of this (at least until now), these wide and varied disciplines have heaped up the supporting evidence. For example, people used to talk a lot about “the missing link”, by which they generally meant the missing link between humans and apes, whereas scientists have in fact discovered a whole host of “missing links” in the guise of close cousins, from the Neanderthals to the strange and more ancient australopithecines. For more exciting missing links, how about the fact that the jaw bone of reptiles exists in four parts and that three of those bones have slowly evolved in humans to form parts of the middle ear. How do we know? Well, there is evidence in the development of mammalian and reptilian embryos, and more recently the discovery of an intermediate creature in which the bones were clearly used concomitantly for both chewing and listening. This is one of many discovered creatures that Darwin’s theory has predicted – whilst the most famous is surely the bird-lizard known as Archaeopteryx. Where, by way of comparison, are the remains of, say, Noah’s Ark?

But Darwin’s theory was not correct in all details. As I have already mentioned, his notion of pangenes was in some ways little better than Lamarck’s theory of acquired characteristics, and so it is perhaps still more remarkable that whilst he looked through a wonky glass, what he gleaned was broadly correct. Although, surprisingly perhaps, it took a monk (and one trained in physics more than in biology) to begin setting the glass properly straight. Enter Gregor Mendel.

Richard Dawkins shows how whales evolved from a cloven-hoofed ancestor, and reveals whales’ closest modern-day cousin:

*

If we think back to what people knew about the world (scientifically speaking) prior to the turn of the twentieth century, it seems astonishing what was about to be discovered within just a few decades. For instance, back in 1900 physicists were still in dispute about the existence of atoms, and meanwhile, astronomers were as then unaware of the existence of independent galaxies beyond the Milky Way. But then, in 1905, Einstein suddenly published three extraordinary papers. In the least well known of these, he proved mathematically how the jiggling Brownian motion of pollen grains on water (observed by Robert Brown almost a hundred years earlier) was caused by collisions of water molecules, and in doing so he finally validated the concept of matter being formed out of particles, thereby proving the existence of atoms and settling a debate about the nature of matter that had begun more than two thousand years earlier in Greece.

Moreover, it wasn’t until the early 1920s that Edwin Hubble (now better known as the father of the idea of the expanding universe) succeeded in resolving the outer parts of other galaxies (previously called nebulae), detecting within their composition collections of billions of individual stars. At last we knew that there were other galaxies just like our own Milky Way.

So in just twenty years, our universe had simultaneously grown and shrunk by a great many orders of magnitude. Nowadays, of course, we know that atoms are themselves composed of smaller particles: electrons, protons and neutrons, which are in turn fashioned from quarks 1; while the galaxies above and beyond congregate within further clusters (the Milky Way being one of the so-called Local Group, which is surely the most understated name for any known object in the whole of science).

The universe we have discovered is structured in multiple layers – though the boundaries between these layers are only boundaries of incomprehension. Looking upwards we encounter objects inconceivably large that are in turn the building blocks of objects much larger again, whilst investigating the finest details of the particle world, we’ve learnt how little fleas have ever smaller fleas…

Our first stabs at understanding the origins of the trillions of galaxies in our visible universe, and at comprehending the nature of the matter and energy that comprises them, have led to speculations based upon solid empirical findings that allow us to construct models of how the physical universe as a whole may have begun. Thus, via a joint collaboration between physicists searching on the macro- and micro-scales, we have finished up with the study of cosmology; the rigorous scientific study of the cosmos no less! (And to most physicists working at the turn of the twentieth century, the idea of a branch of physics solely devoted to the understanding of creation would surely have seemed like pure science fiction.) I hope my digression has helped to set the scene a little…

*

Around the turn of the twentieth century, there also remained a mystery surrounding the science of heredity and the origin of genes. It was of course common sense that children tended to have characteristics reminiscent of their parents, but in precisely what manner those parental characteristics were hybridised had remained a matter of tremendous speculation. It was still widely believed that some kind of fluid-like mingling of genes occurred, little substantial scientific progress having been made on the older ideas about bloodlines.

But those early theories of blended inheritance, which imagined the infusing together of the two gene pools, as two liquids might mix, were mistaken. If genes really behaved this way then surely the characteristics of people would also blend together. Just as we add hot water to cold to make it warm, so a white man and a black woman would surely together procreate medium brown infants, becoming darker or lighter by generations depending on whether further black or white genes were added. Which is indeed true, up to a point, but it is not strictly true. And if it really were so simple, then the range of human characteristics might (as some racial purists had feared) gradually blend to uniformity. But the real truth about inheritance, as Mendel was quietly discovering during the middle of the 19th century, is that genes have an altogether more intriguing method of combination.

*

Mendel was a monk, who aside from observing the everyday monastic duties also taught natural science, principally physics. The work that eventually made him world-renowned, however, involved studies on peas; this was Mendel’s hobby.

He spent many years cross-fertilising varieties and making detailed observations of the succeeding generations. He compared the height of plants. He compared the positioning of flowers and pods on the stem. And he noted subtle differences in shape and colour of seeds, pods and flowers. By comparing generations, Mendel found that offspring showed traits of their parents in predictable ratios. More surprisingly, he noticed that a trait lost in one generation might suddenly re-emerge in the next. So he devised a theory to explain his findings. Like a great many scientific theories, it was ingenious in its simplicity.

Within every organism, he said, genes for each inheritable trait must occur not individually, but in pairs, and in such a way that each gene within these “gene-pairs” is either “dominant” or “recessive” to its partner. In this way, a gene could sometimes be expressed in the individual whilst in different circumstances it might lie dormant for a generation. But please allow me a brief paragraph to explain this modern concept of inheritance more completely and coherently.

The usual way to explain Mendelian Inheritance is in terms of human eye colours. It goes like this: There is one gene for eye colour, but two gene types. These are called “alleles”, meaning “each other”. In this case, one allele produces brown eyes (let’s call this Br), and the other produces blue eyes (Bl). You inherit one of these gene types from your mother and one from your father. So let’s say you get a brown allele from each. That means you have Br-Br and will have brown eyes. Alternatively you may get a blue allele from each, and then you’ll have Bl-Bl and so have blue eyes. So far so simple. But let’s say you get a brown from one parent and a blue from the other. What happens then? Well, Mendel says, they don’t mix to produce green eyes or something in between; rather, one of the genes, the brown one as it happens, will be “dominant”, which means you will have brown eyes. But here’s the interesting bit, since although you have brown eyes you will nevertheless carry an allele for blue eyes – the “recessive” allele. Now let’s say you happen to meet a beautiful brown-eyed girl, who is also carrying the combined Br-Bl genes. What will your beautiful children look like? Well, all things being equal in terms of gene combination – so assuming that you are both equally likely to contribute a Bl allele as a Br allele (i.e., that this is a purely random event) – then there are only four equal possibilities: Br-Br, Br-Bl, Bl-Br, or Bl-Bl. The first three of these pairs will produce dominant brown, whilst the two recessive Bl alleles in the last pair produce blue. So if you happen to have four children, then statistically speaking, you are most likely to produce three with honey brown eyes, and one imbued with eyes like sapphires. And the milkman need never have been involved.
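For anyone who prefers to see the arithmetic spelled out, here is a short Python sketch simulating the very cross just described (the allele labels and the number of trials are my own illustrative choices):

```python
import random
from collections import Counter

# Two brown-eyed parents, each carrying one brown (Br) and one blue (Bl) allele.
random.seed(42)

def child_genotype():
    # Each parent passes on one of their two alleles at random.
    return (random.choice(["Br", "Bl"]), random.choice(["Br", "Bl"]))

def phenotype(genotype):
    # Brown is dominant: a single Br allele is enough for brown eyes.
    return "brown" if "Br" in genotype else "blue"

counts = Counter(phenotype(child_genotype()) for _ in range(100_000))
print(counts)  # roughly 3:1 brown to blue, just as Mendel's ratios predict
```

The 3:1 ratio emerges purely from the random shuffling of discrete alleles, with no blending anywhere.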

Mendel had realised that instead of the old-fashioned “analogue” system, in which our genes added together in some kind of satisfactory proportions – like two voices forming a new harmony – genes actually mix in an altogether more “digital” fashion, where sometimes the gene type is on and sometimes it is off. Inevitably, the full truth is more complicated than this, with alleles for different genes sometimes combining in other ways, which will indeed lead to blending of some kinds of inherited traits. Yet even here, it is not the genes (in the form of the alleles) that are blended, but only the “expressed characteristics” of that pair of alleles – something called the phenotype. Thus, for generation after generation these gene types are merely shuffled and passed on. Indeed the genes themselves have a kind of immortality, constantly surviving, just as the bits and bytes in computer code are unaltered in reproductions. Of course, errors in their copying do eventually occur (and we now know that it is precisely such accidental “mutations” which, by adding increased variety to the gene pool, have served to accelerate the process of evolution). 2

Mendel’s inspired work was somehow lost to science for nearly half a century, and so although he was a contemporary of Darwin and knew of Darwin’s theory – indeed, Mendel owned a German translation of “On the Origin of Species”, in which he had underlined many passages – there is absolutely no reason to suppose that Darwin knew anything at all of Mendel’s ideas.

*

When Mendel’s papers were finally recovered in 1900, they helped set in motion a search for a molecular solution to the question of biological inheritance; a search that would eventually lead to Crick and Watson’s dawning realisation that the structure of DNA must take the form of an intertwined double-helix. Such an extraordinary molecule could peel apart and reform identical copies of itself. DNA, the immortal coil, the self-replicating molecule that lay behind all the reproductive processes of life, sent biologists (not least Crick and Watson) into whirls of excitement. It was 1953 and here was the biological equivalent to Rutherford’s momentous discovery of an inner structure to atoms, almost half a century earlier. Here was the founding of yet another new science. Whilst nuclear and particle physicists were finding more powerful ways to break matter apart, biologists would soon begin dissecting genes.

Aside from the direct consequences of current and future developments in biotechnology (a subject I touch on in the addendum below), the rapid developments in the field of genetics have led to another significant outcome, for biologists have also slowly been proving Darwin’s basic hypothesis. Genes really do adapt from one species to another – and we are beginning to see just precisely how. Yet in complete disregard of the mounting evidence, evolutionary theory still comes under more ferocious attack than any other established theory in science. Why does Darwinism generate such furore amongst orthodox religious groups compared, say, to today’s equally challenging theories of modern geology? Why aren’t creationists so eager to find fault with the field of plate tectonics? (Pardon the pun.) For here is a science in its comparative infancy – only formulated in the 1960s – that no less resolutely undermines the Biblical time-scale for creation, and yet it reaps no comparable pious fury. Rocks just aren’t that interesting apparently, whereas anyone with the temerity to suggest that human beings quite literally evolved from apes… boy, did that take some courage! 3

*

Now at last, I will get to my main point, which is this: given that the question of our true origins has now been formally settled, what are we to conclude and what are the consequences to be? Or put another way, what’s the significance of discovering that just a million years ago – a heartbeat when gauged against the estimated four billion years of the full history of life on Earth – our own ancestors branched off to form a distinct new species of ape?

Well, first and foremost, I think we ought to be clear on the fact that being such relative terrestrial latecomers gives us no grounds for special pleading. We are not in fact perched atop the highest branch of some great evolutionary tree, or put differently, all creation was not somehow waiting on our tardy arrival. After all, if evolution is blind and not goal-orientated, as Darwinism proposes, then all avenues must be equally valid, even those that were never taken. So it follows that all creatures must be evolutionarily equal. Apes, dogs, cats, ants, beetles (for which Darwin during his own Christian youth had noted God’s special fondness, if judged only by their prodigious profusion), slugs, trees, lettuces, mushrooms, and even viruses; his theory shows no preference. All life has developed in parallel, and every species that is alive today evolved from the same evolutionary roots and over the same duration, simply to reach the tips of different branches. The only hierarchy here is a hierarchy of succession – of the living over the dead.

In short then, Darwinism teaches that we are just part of the great nexus of life, and no more central or paramount than our planet is central to the universe. To claim otherwise is to be unscientific, and, as Richard Dawkins has pointed out, depends entirely upon anthropocentrism and the “conceit of hindsight”.

Darwin too quietly recognised that his theory provided no justification for any such pride in human supremacy. Likewise, he refused to draw any clear distinction between human races, correctly recognising all as a single species; an admission that says much for his intellectual courage and honesty, challenging as it did his otherwise deeply conservative beliefs. For Darwin was a Victorian Englishman, and although not a tremendously bigoted one, it must have been hard for him to accept that, amongst many other things, his own theory of evolution meant that all races of men were of equal birth.

*

But if we agree that humans are a specialised kind of ape, then we need to be fair in all respects. We have got into the habit of presuming that mankind, or Homo sapiens – “the wise man”, to apply our own vainglorious scientific denomination – of all the countless species on Earth, is the special one. Unique because, as it used often to be claimed, we alone developed the skill to use tools. Or because we have a unique capacity for complex communication. Or because we are unparalleled creators of wonderful music and poetry. Or because we are just supremely great thinkers – analytical to the point of seeking a meaning in the existence of existence itself. Or more simply, because we are self-aware, whereas most animals seem childishly oblivious even to their own reflected images. Or, most currently fashionable, because as a species we are uniquely sophisticated in an entirely cultural sense – that is, we pass on complex patterns of behaviour to one another like no other critters.

All of our uniqueness, we owe, so it goes, to the extraordinary grey matter between our ears, with everything boiling down eventually to this: we are special because we are such brainy creatures – the cleverest around. But think about it: how can we actually be sure even in this conviction? For what solid proof have we that no other creatures on Earth can match our intellectual prowess?

Well, we might think to look immediately to brain size, but there’s a catch, as it turns out that bigger animals have bigger brain-needs merely to function. Breathing, regulating blood temperature, coping with sensory input, and so on, all require more neural processing the larger a creature becomes. So we must factor this into our equations, or else, to cite a singular example, we must concede that we are much dumber than elephants.

Okay then, let’s divide the weight of a brain by the weight of the animal it belongs to. We might even give this ratio an impressive label such as “the encephalisation quotient” or whatever. Right then, having recalibrated accordingly, we can repeat the measures and get somewhat better results this time round. Here goes: river dolphins have an EQ of 1.5; gorillas 1.76; chimpanzees 2.48; bottlenose dolphins 5.6; and humans an altogether more impressive 7.4. So proof at last that we’re streets ahead of the rest of life’s grazers. But hang on a minute, can we really trust such an arbitrary calculus? Take, for example, the case of fatter humans. Obviously they must have a lower average EQ than their thinner counterparts. So this means fatter people are stupider?

No, measurements of EQ might better be regarded as an altogether rougher indication of intelligence: a method to sort the sheep from the apes. But then, can you actually imagine for a minute that if, say, EQ gave higher results for dolphins than for humans, we would ever have adopted it as a yardstick in the first place? Would we not have more likely concluded that there must be something else we’d overlooked besides body-mass? The fact that dolphins live in water and so don’t need to waste so much brain energy when standing still, or some such. For if we weren’t top of the class then we’d be sure to find that our method was flawed – and this becomes a problem when you’re trying to be rigorously scientific. So either we need more refinement in our tests for animal intelligence, with emphasis placed on being fully objective, or else we must concede that intelligence is too subtle a thing even to be usefully defined, let alone accurately scored.
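For the curious, here is roughly how such a quotient can be computed. The sketch below follows the standard allometric form usually attributed to Jerison – expected brain mass taken as proportional to body mass raised to the two-thirds power – but the constant, the exponent and the species figures are rough textbook values rather than authoritative data, which is precisely why different sources quote somewhat different EQs:

```python
# Hedged sketch of an encephalisation quotient (EQ) along Jerison's lines:
# EQ = actual brain mass / expected brain mass for a mammal of that body mass,
# with expected mass assumed to be 0.12 * body_mass^(2/3), masses in grams.

def eq(brain_g: float, body_g: float) -> float:
    expected = 0.12 * body_g ** (2 / 3)
    return brain_g / expected

# Rough, non-authoritative figures purely for illustration.
for species, brain_g, body_kg in [
    ("chimpanzee", 400, 45),
    ("bottlenose dolphin", 1600, 200),
    ("human", 1350, 65),
]:
    print(f"{species}: EQ = {eq(brain_g, body_kg * 1000):.1f}")
```

Notice how much the answer depends on the assumed constant and exponent – which rather underlines the point about an arbitrary calculus.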

However, a more bullish approach to our claims of greatness goes as follows: look around, do you see any other creatures that can manipulate their environment to such astonishing effects? None has developed the means to generate heat or refrigeration, to make medicines, or to adapt to survive in the most inhospitable of realms, or any of our other monumental achievements. Dolphins have no super-aqua equipment for exploring on land, let alone rockets to carry them to the Sea of Tranquility. Chimpanzees have never written sonnets or symphonies – and never will no matter how infinite the availability of typewriters. So the final proof of our superiority then is this, whether we call it intelligence or give it any other endorsement: technological achievement, artistic awareness, and imagination of every kind.

But what then of our very early ancestors, those living even before the rise of Cro-Magnon 4, and that first great renaissance which happened more than 40,000 years ago? Cro-Magnon people made tools, wore clothes, lived in huts, and painted the wonderful murals at Lascaux in France and at Altamira in Spain. They did things that are strikingly similar to the kinds of things that humans still do today. Homo sapiens of earlier times than these, however, left behind no comparable human artefacts, and yet, physiologically speaking, were little different from you or me. Given their seeming lack of cultural development then, do we have justification for believing them intellectually inferior, or could it be that they simply exercised their wondrous imaginations in more ephemeral ways?

Or let’s take whales, as another example. Whales, once feared and loathed as little more than gigantic fish, are nowadays given a special privilege. Promoted to the ranks of the highly intelligent (after humans obviously), we have mostly stopped brutalising them. Some of us have gone further again, not merely recognising them as emotionally aware and uncommonly sensitive creatures, but ‘communing with them’. Swimming with dolphins is nowadays rated as one of the must-have life experiences along with white-water rafting and bungee jumping. So somehow, and in spite of the fact that whales have never mastered the ability to control or manipulate anything much – tool-use being a tricky business, of course, if you’re stuck with flippers – nevertheless, whales have joined an elite class: the “almost human”. We have managed to see beyond their unbridgeable lack of dexterity, because whales satisfy a great many of our other supposedly defining human abilities – ones that I outlined above.

Dolphins, we learn, can recognise their own reflections. And they use sounds, equivalent to names, as a way to distinguish one another – so do they gossip? How very anthropomorphic of me to ask! Also, and in common with many other species of cetaceans, they sing, or at least communicate by means of something we hear as song. Indeed, quite recent research based on information theory has been revealing; mathematical analysis of the song of the humpbacked whale indicates that it may be astonishingly rich in informational content – so presumably then they do gossip! And not only that, but humpbacked whales (and others of the larger whale species) share a special kind of neural cell with humans, called spindle cells. So might we gradually discover that humpbacked whales are equally as smart as humans? Oh come, come – let’s not get too carried away!

*

Do you remember a story about the little boy who fell into a zoo enclosure, whereupon he was rescued and nursed by one of the gorillas? It was all filmed, and not once but twice in fact – on different occasions and involving different gorillas, Jambo 5 and Binti Jua. 6 After these events, some in the scientific community sought to discount the evidence of their own eyes (even though others who’d worked closely with great apes saw nothing which surprised them at all). The gorillas in question, these experts asserted, evidently mistook the human child for a baby gorilla. Stupidity rather than empathy explained the whole thing. 7

Scientists are rightly cautious, of course, when attributing human motives and feelings to explain animal behaviour; however, a strict denial of parallels, one that precludes all recognition of motives and feelings aside from those of humans, becomes a reductio ad absurdum. Such an overemphasis on the avoidance of anthropomorphism is no measure of objectivity, and leads us just as assuredly to wilful blindness as naïve sentimentality can. Indeed, to arrogantly presume that our closest evolutionary relatives, with whom we share the vast bulk of our DNA, are so utterly different that we must deny the most straightforward evidence of complex feelings and emotions reflects very badly upon us.

But then why stop with the apes? Dolphins are notoriously good at rescuing stranded swimmers, and if it wasn’t so terribly anthropomorphising I’d be tempted to say that they sometimes seem to go out of their way to help. Could it be that they find us intriguing, or perhaps laughable, or even pathetic (possibly in both senses)? – Adrift in the sea and barely able to flap around. “Why do humans decide to strand themselves?” they may legitimately wonder.

Dogs too display all the signs of liking us, or fearing us, and, at other times, of experiencing pleasure and pain, so here again what justification do those same scientists have to assume their expressions are mere simulacra? And do the birds really sing solely to attract potential mates and to guard their territory? Is the ecstatic trilling of the lark nothing more than a pre-programmed reflex? Here is what the eminent Dutch psychologist, primatologist and ethologist, Frans B.M. de Waal, has to say:

“I’ve argued that many of what philosophers call moral sentiments can be seen in other species. In chimpanzees and other animals, you see examples of sympathy, empathy, reciprocity, a willingness to follow social rules. Dogs are a good example of a species that have and obey social rules; that’s why we like them so much, even though they’re large carnivores.” 8

Here’s an entertaining YouTube clip showing how goats too sometimes like to have a good time:

Rather than investigating the ample evidence of animal emotions, for too long the scientific view has been focused on the other end of the telescope. So we’ve had the behaviourists figuring that if dogs can be conditioned to salivate to the sound of bells then maybe children can be similarly trained, even to the extent of learning such unnecessary facts and skills (at least from a survival point of view) as history and algebra. Whilst more recently, with the behaviourists having exited the main stage (bells ringing loudly behind), a new wave of evolutionary psychologists has entered, and research is ongoing; a search for genetic propensities for all traits from homosexuality and obesity, to anger and delinquency. Yes, genes for even the most evidently social problems, such as criminality, are being earnestly sought after, so desperate is the need of some to prove we too are nothing more than complex reflex machines; dumb robots governed by our gene-creators, much as Davros operates the controls of the Daleks. In these ways we have demoted our own species to the same base level as the supposedly automaton beasts.

Moreover, simply to regard every non-human animal as a being without sentience is scientifically unfounded. If anything it is indeed based on a ‘religious’ prejudice; one derived either directly from orthodox faith, or as a distorted refraction via our modern faith in humanism. But it is also a prejudice that leads inexorably into a philosophical pickle, inspiring us to draw equally dopey mechanical caricatures of ourselves.

*

So what is Darwin’s final legacy? Well, that of course remains unclear, and though it is established that his conjectured mechanism for the development and diversity of species is broadly correct, this is no reason to believe that the whole debate is completely done and dusted. And since Darwin’s theory of evolution has an in-built bearing on our relationship to the natural world, and by interpolation, to ourselves, we would be wise to recognise its limitations.

Darwinism offers satisfactory explanations to a great many questions. How animals became camouflaged. Why they took to mimicry. What causes peacocks to grow such fabulous tails – or at least why their fabulous tails grow so prodigiously large. It also helps us to understand a certain amount of animal behaviour. Why male fish more often look after the young than the males of other animal groups. Why cuckoos lay their eggs in the nests of other birds. And why the creatures that produce the largest broods are most often the worst parents.

Darwinism also makes a good account of a wide range of complex and sophisticated human emotions. It copes admirably with nearly all of the seven deadly sins. Gluttony, wrath, avarice and lust present no problems at all. Sloth is a little trickier, though once we understand the benefits of conserving energy, it soon fits into place, whilst envy presumably encourages us to strive harder. Pride is perhaps the hardest to fathom, since it involves an object of affection that hardly needs inventing, at least from a Darwinian perspective. But I wish to leave aside questions of selfhood for later.

So much for the vices then, but what of the virtues? How, for example, are Darwinians able to account for the rise of more altruistic behaviour? For Darwinian purists, altruism arrives as a bit of a hot potato. Not that altruism is a problem in and of itself, for this is most assuredly not the case. Acts of altruism between related individuals are to be expected. Mothers that did not carry genes to make them devoted toward their own children would be less likely to successfully pass on their genes. The same may be said for natural fathers, and this approach can be intelligently elaborated and extended to include altruism within larger, and less gene-related, groups. It is a clever idea, one that can be usefully applied to understanding the organisation of various communities, including those of social insects such as bees, ants, termites and, of course, naked mole rats…! Yes, as strange as it may sound, one special species of subterranean rodents, the naked mole rats, have social structures closely related to those of the social insects, and the Darwinian approach explains this too, as Dawkins brilliantly elucidates in a chapter of his book The Selfish Gene. Yet there remains one puzzle that refuses such insightful treatment.

When I was seventeen I went off cycling with a friend. On the first day of our adventures into the wilderness that is North Wales, we hit a snag. Well, actually I hit a kerb, coming off my bike along a fast stretch of the A5 that drops steeply down into Betws-y-Coed – a route that my parents had expressly cautioned me not to take, but then as you know, boys will be boys. Anyway, as I came to a long sliding halt along the pavement (and not the road itself, as luck would have it), I noticed that a car on the opposite side had pulled up. Soon afterwards, I was being tended to by a very kindly lady. Improvising first aid using tissues from a convenient packet of wet-wipes, she gently stroked as much of the gravel from my wounds as she could. She calmed me, and she got me back on my feet, and without all her generous support we may not have got much further on our travels. I remain very grateful to this lady, a person who I am very unlikely to meet ever again. She helped me very directly, and she also helped me in another way, by teaching me one of those lessons of life that stick. For there are occasions when we all rely on the kindness of strangers, kindness that is, more often than not, as freely given as it is warmly received. Yet even such small acts of kindness pose a serious problem for Darwinian theory, at least, if it is to successfully explain all forms of animal and human behaviour. The question is simply this: when there is no reward for helping, why should anyone bother to stop?

Dawkins devotes an entire chapter of The Selfish Gene to precisely this subject. Taking an idea from “game theory” called “the prisoner’s dilemma”, he sets out to demonstrate that certain strategies of life that aim toward niceness are actually more likely to succeed than other more cunning and self-interested alternatives. His aim is to prove that, contrary to much popular opinion, “nice guys finish first”. But here is a computer game (and a relatively simple one at that), whereas life, as Dawkins knows full well, is neither simple nor a game. In consequence, Dawkins then grasps hold of another twig. Pointing out how humans are a special case – as if we needed telling…
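For readers unfamiliar with the game, here is a bare-bones Python sketch of an iterated prisoner’s dilemma tournament; the payoff values and the three strategies are standard textbook choices of mine, not Dawkins’ exact set-up:

```python
from itertools import combinations

# Standard payoffs: C = cooperate, D = defect.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def always_defect(own, other):
    return "D"

def tit_for_tat(own, other):
    # Start nice, then simply copy the opponent's previous move.
    return other[-1] if other else "C"

def grudger(own, other):
    # Cooperate until the opponent defects even once; never forgive.
    return "D" if "D" in other else "C"

def play(strat_a, strat_b, rounds=200):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strat_a(hist_a, hist_b)
        move_b = strat_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

strategies = [always_defect, tit_for_tat, grudger]
totals = {s.__name__: 0 for s in strategies}
for a, b in combinations(strategies, 2):
    sa, sb = play(a, b)
    totals[a.__name__] += sa
    totals[b.__name__] += sb

print(totals)  # the two 'nice' strategies comfortably outscore the defector
```

Note, though, that the outcome depends entirely on the field of competitors: add an unconditionally cooperative sucker to the mix and the pure defector profits handsomely – which rather reinforces the point that life is neither simple nor a game.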

As a species, he says, we have the unique advantage of being able to disrespect the programming of our own selfish genes. For supporting evidence he cites the use of contraception, which is certainly not the sort of thing that genes would approve of. But then why are we apparently unique in having this ability to break free of our instinctual drives? Dawkins doesn’t say. There is no explanation other than that same old recourse to just how extraordinarily clever we are – yes, we know, we know! Yet the underlying intimation is really quite staggering: that human beings have evolved to be so very, very, very clever, that we have finally surpassed even ourselves.

As for disinterested acts of altruism of the kind exemplified by the Samaritanism of my accidental friend, these, according to strict Darwinians such as Dawkins, must be accidents of design. A happy by-product of evolution. A spillover. For this is the only explanation that evolutionary theory in its current form could ever permit.

Embedded below is one of a series of lectures given by distinguished geneticist and evolutionary biologist Richard Lewontin in 1990. The minutely detailed case he makes against the excesses of a Darwinian approach to human behaviour, as well as the latent ideology of socio-biology, is both lucid and persuasive:

*

Allow me now to drop a scientific clanger. My intention is to broaden the discussion and tackle issues about what Darwinism has to say about being human, and no less importantly, about being animal or plant. To this end then, I now wish to re-evaluate the superficially religious notion of “souls”; for more or less everything I wish to say follows from consideration of this apparently archaic concept.

So let me begin by making the seemingly preposterous and overtly contentious statement that just as Darwin’s theory in no way counters a belief in the existence of God, or gods as such, neither does it entirely discredit the idea of souls. Instead, Darwin eliminated the apparent need for belief in the existence of either souls or gods. But this is by no means the same as proving they do not exist.

Now, by taking a more Deistic view of Creation (as Darwin himself more or less maintained until late in his life), one may accept the point about some kind of godly presence – there is certainly room for God as an original creative force of some ultimately inscrutable kind – and yet still contend that the idea of souls has altogether perished. For evolutionary theory establishes beyond all reasonable doubt that we are fundamentally no different from the other animals, or in essence from plants and bacteria. So isn’t it a bit rich then, clinging to an idea like human souls? Well, yes, if you put it that way, though we may choose to approach the same question differently.

My contention is that ordinary human relations already involve the notion of souls, only that we generally choose not to use the word soul in these contexts, presuming it to be outmoded and redundant. Given the religious weight of the word this will perhaps seem a scandalous contention, so allow me to elucidate. Everyday engagement between human beings (and no doubt other sentient animals), especially if one is suffering or in pain, automatically involves the feeling of empathy. So what then is the underlying cause of our feelings of empathy? Only the most hard-nosed of behaviourists would dismiss it as merely a pre-programmed knee-jerk response.

Well, empathy, almost by definition, must mean that, in the other, we recognise a reflection of something found within ourselves. But then, what is it that we are seeing reflected? Do we have any name for it? And is not soul just as valid a word as any other? Or, to consider a more negative context, if someone commits an atrocity against others, then we are likely to regard this person as wicked. We might very probably wish to see this person punished. But how can anyone be wicked unless they were free to choose otherwise? So then, what part of this person was actually free? Was it the chemical interactions in their brain, or the electrical impulses between the neurons, or was it something altogether less tangible? And whatever the cause, we cannot punish the mass of molecular interactions that comprises their material being, because punishment involves suffering and molecules are not equipped to suffer. So ultimately we can only punish “the person within the body”, and what is “the person within the body” if not their soul?

But why is it, you may be wondering, that I want to rescue the idea of souls at all? For assuredly you may argue – and not without sound reason – that you have no want or need of any woolly notions such as soul or spirit to encourage you to become an empathetic and loving person. You might even add that many of the cruellest people in history believed in the existence of the human soul. And I cannot counter you on either charge.

But let’s suppose that finally we have banished all notions of soul or spirit completely and forever – what have we actually achieved? And how do we give a fair account of that other quite extraordinary thing, which is ordinary sentience? For quite aside from the subtle complexity of our moods and our feelings of beauty, of sympathy, of love, we must first account for our senses. Those most primary sensory impressions that form the world we experience – the redness of red objects, the warmth of fire, the saltiness of tears – the inexpressible, immediate, and ever-present streaming experience of conscious awareness that philosophers have called qualia. If there are no souls then what is actually doing the experiencing? And we should remember that here “the mind” is really nothing more or less, given our current ignorance, than a quasi-scientific synonym for soul. It is another name for the unnailable spook.

Might we have developed no less successfully as dumb automata? There is nothing in Darwin or the rest of science that calls for any requirement of self-conscious awareness to ensure our survival and reproduction. Nothing to prevent us negotiating our environment purely with sensors connected to limbs, via programmed instructions vastly more complex yet inherently no different from the ones that control this word processor, and optimised as super-machines that have no use for hesitant, stumbling, bumblingly incompetent consciousness. So what use are qualia in any case?

In purely evolutionary terms, I don’t need to experience the sensation of red to deal with red objects, any more than I need to see air in order to breathe. Given complex enough programs and a few cameras, future robots can (and presumably will) negotiate the world without need of actual sensations, let alone emotions. And how indeed could the blind mechanisms of dumb molecules have accidentally arranged themselves into such elaborate forms as to enable cognitive awareness at all? Darwin does not answer these questions – they fall beyond his remit. But then no one can answer these questions (and those who claim reasons to dismiss qualia on philosophical grounds can in truth only dismiss the inevitably vague descriptions, rather than the ever-present phenomenon itself – or have they never experienced warmth, touched roughness, nor seen red?).

And so the most ardent of today’s materialists wish to go further again. They want to rid the world of all speculation regarding the nature of mind. They say it isn’t a thing at all, but a process of the brain, which is conceivably true. (Although I’d ask: why stop at the brain?)

One fashionable idea goes that really we are “minding”, which is interesting enough given our accustomed error of construing the world in terms of objects rather than actions; nouns coming easier than verbs to most of us. But then, whether the mind might be best represented by a noun or a verb seems for now – given that we still know next to nothing in any neurological sense – to be purely a matter of taste.

The modern reductionism that reduces mind to brain often throws up an additional claim. Such material processes, it says, will one day be reproduced artificially in the form of some kind of highly advanced computer brain. Well, perhaps this will indeed happen, and perhaps one day we really will have “computers” that actually experience the world, rather than the sorts of machines today that simply respond to sensors in increasingly complex ways. I am speculating about machines with qualia: true artificial brains that are in essence just as aware as we are. But then how will we know?

Well, that’s a surprisingly tricky question and it’s one that certainly isn’t solved by the famous Turing Test, named after the father of modern computing, Alan Turing. For the Turing Test is merely a test of mimicry, claiming that if one day a computer is so cunningly programmed that it has become indistinguishable from a human intelligence then it is also equivalent. But that of course is nonsense. It is nonsense that reminds me of a very cunning mechanical duck someone once made: one that could walk like a duck, quack like a duck, and if rumours are to be believed, even crap like a duck. A duck, however, it was not, and nor could it ever become one, no matter how elaborate its clockwork innards. And as with ducks, so with minds.

But let’s say we really will produce an artificial mind, and somehow we can be quite certain that we really have invented just such an incredible, epoch-changing machine. Does this mean that in the process of conceiving and manufacturing our newly conscious device, we must inevitably learn what sentience is of itself? This is not a ridiculous question. Think about it: do you need to understand the nature of light in order to manufacture a light bulb? No. The invention of light bulbs preceded the modern physical understanding of light. And do we yet have a full understanding of what light truly is, and is such a full understanding finally possible at all?

Yet there are a few scientists earnestly grappling with questions of precisely this kind, venturing dangerously near the forests and swamps of metaphysics, in search of answers that will require far better knowledge and understanding of principles of the mind. Maybe they’ll even uncover something like “the seat of the soul”, figuring out whence consciousness springs. Though I trust that you will not misunderstand me here, for it is not that I advocate some new kind of reductionist search for the soul within, by means of dissection or the application of psychical centrifuges using high-strength magnetic fields or some such. As late as the turn of the twentieth century, there was indeed a man called Dr. Duncan MacDougall who embarked on just such a scheme: weighing people at the point of death, in experiments to determine the mass of the human soul. 9 A futile search, of course, for soul – or mind – is unlikely to be a substantial thing, at least in the usual sense. And though contingent on life, we have no established evidence for its survival into death.

My own feeling is that the soul is no less mortal than our brains and nervous systems, on which it seemingly depends. But whatsoever it turns out to be, it is quite likely to remain immeasurable – especially if we choose such rudimentary apparatus as a set of weighing scales for testing it. The truth is that we know nothing as yet, for the science of souls (or minds if you prefer) is still without its first principle. So the jury is out on whether science will ever explain what makes a human being a being at all, or whether it is another of those features of existence that all philosophy is better served to “pass over in silence”.

Here is what respected cognitive scientist Steven Pinker has to say of sentience in his entertainingly presented and detailed overview of our present understanding of How the Mind Works:

“But saying that we have no scientific explanation of sentience is not the same as saying that sentience does not exist at all. I am as certain that I am sentient as I am certain of anything, and I bet you feel the same. Though I concede that my curiosity about sentience may never be satisfied, I refuse to believe that I am just confused when I think I am sentient at all! … And we cannot banish sentience from our discourse or reduce it to information access, because moral reasoning depends on it. The concept of sentience underlies our certainty that torture is wrong and that disabling a robot is the destruction of property but disabling a person is murder.” 10

*

There is a belief common to a camp of professional scientists less fastidious than Pinker which holds, for the sake of simplicity, that consciousness, if it was ever attached at all, was supplied by Nature as a sort of optional add-on, and that every human experience is fully reducible to an interconnected array of sensory mechanisms and data-processing systems. Adherents to this view tend not to think too much about sentience, of course, and in rejecting their own central human experience they thereby commit a curiously deliberate act of self-mutilation, one that leaves only zombies fit for ever more elaborate Skinner boxes 11 – even when, beyond their often clever rationalisations, we all share a profound realisation that there is far more to life than mere stimulus and response.

Orwell, wily as ever, was alert to such dangers in modern thinking, and reworking a personal anecdote into grim metaphor, he neatly presented our condition:

“… I thought of a rather cruel trick I once played on a wasp. He was sucking jam on my plate, and I cut him in half. He paid no attention, merely went on with his meal, while a tiny stream of jam trickled out of his severed œsophagus. Only when he tried to fly away did he grasp the dreadful thing that had happened to him. It is the same with modern man. The thing that has been cut away is his soul, and there was a period — twenty years, perhaps — during which he did not notice it.”

Whilst Orwell regards this loss as deeply regrettable, he also recognises it as a very necessary evil. Giving heed to how nineteenth century religious belief was “…in essence a lie, a semi-conscious device for keeping the rich rich and the poor poor…”, he is nevertheless dismayed at how hastily we have thrown out the baby with the holy bathwater. Thus he continues:

“Consequently there was a long period during which nearly every thinking man was in some sense a rebel, and usually a quite irresponsible rebel. Literature was largely the literature of revolt or of disintegration. Gibbon, Voltaire, Rousseau, Shelley, Byron, Dickens, Stendhal, Samuel Butler, Ibsen, Zola, Flaubert, Shaw, Joyce — in one way or another they are all of them destroyers, wreckers, saboteurs. For two hundred years we had sawed and sawed and sawed at the branch we were sitting on. And in the end, much more suddenly than anyone had foreseen, our efforts were rewarded, and down we came. But unfortunately there had been a little mistake. The thing at the bottom was not a bed of roses after all, it was a cesspool full of barbed wire.” 12

On what purely materialistic grounds can we construct any system of agreed morality? Do we settle for hedonism, living our lives in the unswerving pursuit of personal pleasure; or else insist upon the rather more palatable, though hardly more edifying, alternative of eudaemonism, with its eternal pursuit of individual happiness? Our desires for pleasure and happiness are evolutionarily in-built, and it is probably fair to judge that most of us, if not all, find great need of both to proceed through life with any healthy kind of disposition. Pleasure and happiness are wonderful gifts, to be cherished when fortune blows them to our shore. Yet pleasure is more often short-lived, whilst happiness too is hard to maintain. So they hardly stand as rocks, and provide little in the way of foundations to build solidly upon. Moreover, they are not, as we are accustomed to imagine, objects to be sought after at all. If we chase either one then it is perfectly likely that it will recede ever further from our reach. So it is better, I believe, to look upon these true gifts as we find them, or rather, as they find us: evanescent and only ever now. Our preferred expressions of the unfolding moment of life. To measure our existence solely against them is, however, to miss the far bigger picture of life, the universe and everything. 13

We might decide, of course, to raise the social above these more individualistic pursuits: settling on the Utilitarian calculus of increased happiness (or else reduced unhappiness) for the greatest number. But this is a rough calculus, and one that, however subtly conceived, never finally escapes from its own deep moral morass. For Utilitarianism, though seeking to secure the greatest collective good, is, by construction, blind to all evils as such, being concerned always and only with determining better or worse outcomes. The worst habit of Utilitarianism is to prefer ends always above means. Lacking moral principle, it grants licence for “necessary evils” of every prescription: all wrongs being weighed (somehow) against perceived benefits.

We have swallowed a great deal of this kind of poison, so much so that in these secular times we feel uncomfortable speaking of “acts of evil” or of “wickedness”. As if these archaic terms might soon be properly expurgated from our language. Yet still we feel the prick of our own conscience. A hard-wired sense of what is most abhorrent, combined with an innate notion of justice that once caused the child to complain “but it isn’t fair… it isn’t fair!”  Meanwhile, the “sickness” in the minds of others makes us feel sick in turn.

On what grounds can the staunchest advocates of materialism finally challenge those who might turn and say: this baby with Down’s Syndrome, this infant with polio, this old woman with Parkinson’s Disease, this schizophrenic, these otherwise healthy but unwanted babies or young children, haven’t they already suffered enough? And if they justify a little cruelty now in order to stave off greater sufferings to come, or more savagely still, claim that the greater good is served by the painless elimination of a less deserving few, what form should our prosecution take? By adopting a purely materialistic outlook, then, we are collectively drawn, whether we wish it or not, toward the pit of nihilism. Even the existentialists, setting off determined to find meaning in the here and now, sooner or later recognised the need for some kind of transcendence, or else abandoned all hope.

*

Kurt Vonnegut was undoubtedly one of the most idiosyncratic of twentieth century writers. 14 During his lifetime, Vonnegut was often pigeonholed as a science fiction writer, no doubt because his settings are very frequently in some way futuristic, though as science fiction goes, his stories are generally rather earth-bound. In general, Vonnegut seems more preoccupied with the unlikely interactions between his variety of freakish characters (many of whom reappear in different novels) than in using his stories as a vehicle to project his vision of the future itself. Deliberately straightforward, his writing is ungarnished and propelled by sharp, snappy sentences. He hated semi-colons, calling them grammatical hermaphrodites.

Vonnegut often used his fertile imagination to tackle the gravest of subjects, clowning around with dangerous ideas, and employing the literary equivalent of slapstick comedy to puncture human vanity and to make fun of our grossest stupidities. He liked to sign off chapters with a hand-drawn asterisk, because, he said, it represented his own arsehole. As a satirist then, he treads a path pioneered by Swift and Voltaire: saying the unsayable whilst disguising his contempt under the cover of phantasy. He has become a favourite author of mine.

In 1992, he was named Humanist of the Year by the American Humanist Association. In his acceptance speech, he took the opportunity to connect together ideas that had contributed to his own understanding of what it meant to be a humanist; ideas that ranged over a characteristically shifting and diverse terrain. Here were his concluding remarks:

“When I was a little boy in Indianapolis, I used to be thankful that there were no longer torture chambers with iron maidens and racks and thumbscrews and Spanish boots and so on. But there may be more of them now than ever – not in this country but elsewhere, often in countries we call our friends. Ask the Human Rights Watch. Ask Amnesty International if this isn’t so. Don’t ask the U.S. State Department.

And the horrors of those torture chambers – their powers of persuasion – have been upgraded, like those of warfare, by applied science, by the domestication of electricity and the detailed understanding of the human nervous system, and so on. Napalm, incidentally, is a gift to civilization from the chemistry department of Harvard University.

So science is yet another human-made God to which I, unless in a satirical mood, an ironical mood, a lampooning mood, need not genuflect.” 15

*

Rene Descartes is now most famous for having declared “cogito ergo sum”, which means of course “I think therefore I am”. It was a necessary first step, or so he felt, to escape from the paradox of absolute skepticism, the place from which he had chosen to set out at the beginning of his metaphysical meditations. What Descartes was basically saying was this: look here, I’ve been wondering whether I exist or not, but now, having caught myself in the act, I can be sure that I do – for even if I must remain unsure of everything else besides, I cannot doubt that I am doubting. It is important to realise here that Descartes’ proposition says more than perhaps first meets the eye. After all, he intends it as a stand-alone proof, and thus to be logically self-consistent, and the key to understanding how lies in his use of the word “therefore”, which automatically implies his original act of thinking. If challenged, then, to say how he can be certain even that he is thinking, Descartes’ defence relies upon the very act of thinking (or doubting, as he later put it 16) described in the proposition. Thinking is undeniable, Descartes is saying, and my being depends on this. Yet this first step is already in error, and importantly, the consequences of this error resonate still throughout modern western thought.

Rene Descartes, a Christian brought up to believe that animals had no soul (as Christians are wont to do), readily persuaded himself that they therefore felt no pain. It was a belief that permitted him to routinely perform horrific experiments in vivisection (he was a pioneer in the field). I mention this because strangely, and in spite of Darwin’s solid refutation of man’s pre-eminence over beasts, animal suffering is still regarded as entirely different in kind to human suffering, even in our post-Christian society. And I am sorry to say that scientists are hugely to blame for this double standard. Barbaric experiments, most notoriously in the field of psychology, alongside unnecessary tests for new products and new weapons, are still performed on every species aside from ours; whilst in more terrible (and shamefully recent) times, when scientists were afforded licence to redraw the line above the species level, their demarcations made on grounds of fitness and race, the same cool-headed objectivity was applied to the handicapped, to prisoners of war, and to the Jews. It is better that we never forget how heinous atrocities have too often been committed in the name and pursuit of coldly rational science.

Rene Descartes still has a role to play in this. For by prioritising reason in order to persuade himself of his own existence, he encouraged us to follow him into error. To mix up our thinking with our being. To presume that existence is somehow predicated on reasoning, and not, at least not directly, on the fact that we feel, or that we sense, or most fundamentally, that we are. The implication being that if it is rationality that sets us apart from the beasts, then we exist in a fuller sense than the beasts ever can.

To be absolutely certain of the reality of a world beyond his mind, however, Descartes needed the help of God. Of a living God of Truth and Love. For were it not for the certainty of God’s existence, Descartes argued, his mind – though irrefutably extant – might yet be prey to the illusions of some kind of “deceitful daemon”. He might be nothing more than a brain in a tank, to give his idea a modern slant, plugged into what today would most probably be called The Matrix.

Realising that everything he sensed and felt might conceivably be an elaborately constructed illusion, Descartes found that only his profound knowledge of a God of Truth – a God who made the world as true and honest as it appeared to be – could save his philosophy from descent into pure solipsism. But this primary dualism of mind and world is itself the division of mind and body – a division of self – while to regard Reason as the primary and most perfect attribute of being obviously establishes the mind above the body, and, more generally, spirit above matter. This is the lasting lesson Descartes taught, and it is a lesson we have committed so deeply to our Western consciousness that we have forgotten we ever learnt it in the first place.

The significant difference in today’s world of science, with God now entirely outside of the picture, is that Descartes’ hierarchy has been totally up-ended. Matter is the new boss, and mind, its servant. 17

*

But we might also turn this whole issue on its head. We might admit the obvious. Concede that although we don’t know what it is exactly, there is some decidedly strange and immaterial part to ourselves. That it is indeed the part we most identify with – the part we refer to so lovingly as “I”. And that it is this oh-so mysterious part of us which provides all our prima facie evidence for existence itself. Though in admitting this, the question simply alters. It becomes: how to account for the presence of such a ghost inside our machines? For what outlandish contrivance would we need to reconnect the matter of our brains with any such apparently in-dwelling spirit? And whereas Rene Descartes once proposed that mind and body might be conjoined within the mysterious apparatus of our pineal gland (presumably on the grounds that the pineal gland is an oddly singular organ), we know better and so must look for less localised solutions. In short then, we may finally need to make a re-evaluation of ourselves, not merely as creatures, but as manifestations of matter itself.

Yet, in truth, all of this is really a Judeo-Christian problem; a deep bisection where other traditions never made any first incision. For what is “matter” in any case? Saying it’s all atoms and energy doesn’t give a final and complete understanding. Perhaps our original error was to force such an irreconcilable divorce between nebulous soul (or mind) and hard matter, when they are so indivisibly and gloriously codependent; for though Science draws a marked distinction between the disciplines of physics and psychology, the distinction stands only for the sake of convenience – for the sake, indeed, of ignorance.

To begin then, let’s try to re-establish some sense of mystery regarding the nature of matter itself – such everyday stuff that we have long taken for granted that its behaviour can be understood and predicted through careful measurement and mathematical projection. Here indeed, Freeman Dyson brings his own expertise in quantum theory, combined with his genius for speculation, to consider the fascinating subject of mind and its relationship to matter:

“Atoms in the laboratory are weird stuff, behaving like active agents rather than inert substances. They make unpredictable choices between alternative possibilities according to the laws of quantum mechanics. It appears that mind, as manifested by the capacity to make choices, is to some extent inherent in every atom. The universe as a whole is also weird, with laws of nature that make it hospitable to the growth of mind.”

Dyson is drawing upon his very deep understanding of quantum physics, and yet already he has really said too much. Quantum choice is not the same as human choice. Quantum choice depends on random chance, which is the reason Einstein famously asserted that “God does not play dice”. Indeed I’m not sure how quantum theory, as it is currently understood, could ever account for the existence of free will and volition, quite aside from the overriding mystery of sentience itself. So Dyson’s more important point is perhaps his last one: that the universe is “hospitable to the growth of mind”. This is too often overlooked. And for Dyson, it offers reason enough for religious contemplation:

“I do not make any clear distinction between mind and God. God is what mind becomes when it has passed beyond the scale of our comprehension. God may be either a world-soul or a collection of world-souls. So I am thinking that atoms and humans and God may have minds that differ in degree but not in kind.” 18

I share with Dyson the opinion that it is better to relish these mysteries than to retreat to the dry deception of material certainty. For, as Shakespeare summed up so marvellously in his final play The Tempest: “we are such stuff as dreams are made on…” 19 And perhaps this is still the best description we have of ourselves, even though we have no idea whatsoever how, as dream-machines, our dreams are woven.

A toast then! Feel free to join me in raising your glass… to your own mind, your psyche, your soul, call it what you will – a rose by any other name and all that. Three cheers! And to consciousness! To sentience! To uncanny awareness! That same stuff all our dreams are made on…

So with great appreciation and warm affection, here’s to that strangest of things: that thing I so very casually call my-self! But even more than this. To the actual stuff of our lives, to the brain, the entire central nervous system and far beyond. To the eyes and ears and fingertips; to the whole apparatus of our conscious awareness; and to the sentience of all our fellows, whether taking human or other forms! To the strangeness of the material world itself, from which all sentience has miraculously sparked! To the vast and incomprehensible Universe no less, whether manifestly inward or outward, for the distinction may be a finer one than we are in the habit of presuming! Here’s to wondering what we are… Drink up!

Next chapter…

*

John Searle is a philosopher who has closely studied the nature of consciousness, and who concludes that mind, though unique amongst biological phenomena and deeply mysterious, is nevertheless a natural function of brain activity. In this lecture he summarises the many failures of the current “scientific” approach to questions of consciousness:

In the interview below Searle discusses why he rejects both the hard-line materialist dismissal of consciousness as an illusion (which is actually nonsensical) and dualist alternatives that rely upon a false division between mind and matter:

And finally, Searle outlines the main difficulties surrounding the unresolved philosophical paradox of free will. Put succinctly, he says that although it is impossible to prove human beings have free will, and although any capacity for free will seems to defy physical causality, we are nevertheless compelled to experience conscious rational decision-making on a daily basis:

*

Addendum: the return of Frankenstein!

The issues surrounding the use of genetically modified organisms (GMOs) are many and complex, but it is perfectly clear that new developments in genetics, like those in nuclear physics more than half a century ago, have automatically opened the door to some quite extraordinary possibilities. Possibilities that will most assuredly impact our future no less dramatically than the advent of atomic reactors and the hydrogen bomb impacted our very recent past – and still continue to affect us today.

The need for a proper debate is long overdue but, hardly surprisingly, the huge bio-tech corporations prefer to keep the debate closed down. Monsanto, for instance, who claim that it’s perfectly safe to release their GMOs directly into our environment, were also in the habit of claiming that their herbicide Roundup is so harmless you can drink it! 20 But then why on earth would anyone (or at least anyone not in their pocket) trust such self-interested and deliberately compromised risk assessments? The short answer is that the precautionary principle has once again been overridden by money and influence.

What we really need, of course, is a proper debate about the use of genetic modification. A debate that is open and public: a forum for discussion amongst leading experts (and especially those not associated with the powerful bio-tech firms); scientists from other fields, who though ignorant on specifics, might bring a detached expertise by virtue of familiarity with scientific procedures; alongside representatives from other interested parties such as ‘consumers’ (that’s the rest of us by the way – we all consume, and though I hate the word too, it at least offers a little better perspective on our role within the current system, since this is how the system itself defines us).

This great debate needs to be fully inclusive, welcoming intelligent opinion, whether concordant or dissenting. No reasoned objection from any quarter should be summarily dismissed as unscientific or anti-scientific, as is so often the case, because we must never leave it for technicians alone to decide on issues that so directly affect our common future. Relying on highly specialised experts alone – even when those experts are fully independent (as they so rarely are these days) – would be as unwise as it is anti-democratic.

Genetic manipulation is already upon us. It is already helping in the prevention and treatment of diseases, and in the production of medicines such as insulin (although even here serious questions are arising with regard to the potentially harmful side-effects of using a genetically modified product). More controversial again is the development of pest- and drought-resistant strains of crops; developments that are claimed by their producers to have alleviated a great deal of human suffering already, but which seem to have brought misery of new kinds – I will come back to this later.

And then we come to the development of Genetic Use Restriction Technology (GURT), better known as ‘suicide’ or ‘Terminator’ seeds, which are promoted by the industry as a ‘biosafety’ solution. Engineered sterility being a clever way of preventing their own genetically modified plants from causing unwanted genetic contamination – which we might think of as a new form of pollution. The argument being that if modified genes (whether pharmaceutical, herbicide-resistance or ‘Terminator’ genes) from a ‘Terminator’ crop get transferred to related plants via cross-pollination, the seed produced from such pollination will be sterile. End of problem.

But this is merely an excuse, of course, and if used in this way, the new technology will ultimately prevent over a billion of the poorest people in the world from continuing in their age-old practice of saving seeds for resowing, which will, as a consequence, make these same farmers totally dependent on a few multinational bio-tech companies. All of which serves as an excellent means for monopolising the world’s food supplies, and offers a satisfactory solution only for the owners of companies like Monsanto. 21

In any case, do we really wish to allow patents on specific genes, opening the door to the corporate ownership of the building blocks of life itself? The world-renowned physicist and futurist visionary Freeman Dyson draws a direct comparison to earlier forms of slavery:

“The institution of slavery was based on the legal right of slave-owners to buy and sell their property in a free market. Only in the nineteenth century did the abolitionist movement, with Quakers and other religious believers in the lead, succeed in establishing the principle that the free market does not extend to human bodies. The human body is God’s temple and not a commercial commodity. And now in the twenty-first century, for the sake of equity and human brotherhood, we must maintain the principle that the free market does not extend to human genes.” 22

Nor, I would quickly add, should it extend to the ownership of genes of other higher species of animal or plant life. Moreover, I personally have no wish whatsoever for apples, tomatoes, potatoes (or even tobacco) that provide the RDA for all my nutritional needs, or any other supposed improvement on the original designs – preferring to trust to apples, tomatoes and potatoes that evolved alongside my own human digestive system. And this ought not to be treated as merely a preference, but established as a human right, since we all have the right not to eat GMOs just as we have the right to be vegan (not that I’m a vegan, by the way).

Beyond this, we also need to consider the many perfectly serious and inescapable ethical issues that arise once you are tinkering with the primary source code of life itself. Take cloning as an interesting example.

Identical twins are essentially clones, having both developed from the same fertilised egg, and thus sharing the same DNA. But then nature sometimes goes one step further again:

A form of virgin birth has been found in wild vertebrates for the first time.

Researchers in the US caught pregnant females from two snake species and genetically analysed the litters.

That proved the North American pit vipers reproduced without a male, a phenomenon called facultative parthenogenesis that has previously been found only in captive species. 23

I have since learned that parthenogenesis (reproduction without fertilisation or “virgin birth”) is surprisingly common throughout the plant and animal kingdoms. Birds do it, bees do it… and even mammals have been induced to do it. So cloning is not inherently unnatural, and if carried out successfully (as it frequently is in nature), it may one day prove no more harmful, nor fraught with latent dangers, to be a cloned individual than to be one produced by other forms of artificial reproduction. Furthermore, since we already know what human twins are like, we already know what human clones will be like. Yet many ethical questions still hang.

For instance, should anyone be allowed to clone themselves? Or more generally, who chooses which of us are to be cloned? Do we just leave it to the market to decide? And why would we ever want a world populated by identical (or rather, approximately identical – since no two twins are truly identical, and there are sound biological reasons for believing clones will never be perfectly reproduced either) human beings? Such ethical questions are forced by the new biotechnologies. And there are many further reasons why ordinary, intelligent public opinion needs to be included in the debate.

Here is Freeman Dyson again, summarising his own cautious optimism as we enter the age of the new ‘green technologies’:

“I see two tremendous goods coming from biotechnology in the next century, first the alleviation of human misery through progress in medicine, and second the transformation of the global economy through green technology spreading wealth more equitably around the world. The two great evils to be avoided are the use of biological weapons and the corruption of human nature by buying and selling genes. I see no scientific reason why we should not achieve the good and avoid the evil.

The obstacles to achieving the good are political rather than technical. Unfortunately a large number of people in many countries are strongly opposed to green technology, for reasons having little to do with the real dangers. It is important to treat the opponents with respect, to pay attention to their fears, to go gently into the new world of green technology so that neither human dignity nor religious conviction is violated. If we can go gently, we have a good chance of achieving within a hundred years the goals of ecological sustainability and social justice that green technology brings within our reach.” 24

Dyson was no doubt being too optimistic, with many of the dangers of GMOs slowly coming to light more than two decades after he uttered these words as part of his acceptance speech for the Templeton Prize in 2000.

Meanwhile in 2012, Greenpeace issued the following press release. It contains the summary of an open letter sent by nearly a hundred Indian scientists to the Supreme Court of India:

An official report submitted by the technical Expert committee set up by the Supreme Court of India comprising of India’s leading experts in molecular biology, toxicology and biodiversity – unanimously recommends a 10-year moratorium on all field trials of GM Bt [insecticide producing due to genes from Bacillus thuringiensis] food crops, due to serious safety concerns. The committee has also recommended a moratorium on field trials of herbicide tolerant crops until independent assessment of impact and suitability, and a ban on field trials of GM crops for which India is center of origin and diversity.

The report’s recommendations are expected put a stop to all field releases of GM food crops in India, including the controversial Bt eggplant, whose commercial release was put under an indefinite moratorium there last February 2010. Contrarily, the same Bt eggplant is currently being evaluated for approval in the Philippines.

“This official unanimous declaration on the risks of GMOs, by India’s leading biotech scientists is the latest nail on the coffin for GMOs around the world,” said Daniel M. Ocampo, Sustainable Agriculture Campaigner of Greenpeace Southeast Asia. “It is yet another proof that GMOs are bad for the health, bad for the environment, bad for farmers and bad for the economy.” 25

For though it would be foolish to fail to recognise the enormous potential benefits of some of the new ‘green technologies’, any underestimate of the hazards is sheer recklessness. And this is where my own opinion differs significantly from that of enthusiasts like Dyson. This science is just so brilliantly new, and so staggeringly complex. The dangers are real and difficult to overstate, and so public concern is fully justified, whether over health and safety issues, over the politico-economic repercussions, or due to anxieties of a more purely ethical kind.

*

Please note that for the purposes of ‘publishing’ here I have taken advantage of the option to incorporate hypertext links and embed videos – in order to distinguish additional commentary from the original text, all newly incorporated text has been italicised.

1 There is sound evidence for believing that protons and neutrons are made of quarks, whereas electrons it seems are a type of fundamental particle which has no component parts.

2 My use of the analogue/digital comparison is simplistic, of course, but then it is only intended as a loose analogy, nothing more.

3 Since writing this I have come upon a range of so-called Young Earth Theories of Geology that contradict my former opinion. Apparently there are indeed groups of Creationists intent on disproving ideas of a 4.5 billion year old planet in favour of a ten thousand year prehistory. Needless to say there is no supporting evidence for this contention.

4

“Cro-magnons are, in informal usage, a group among the late Ice Age peoples of Europe. The Cro-Magnons are identified with Homo sapiens sapiens of modern form, in the time range ca. 35,000-10,000 b.p. […] The term “Cro-Magnon” has no formal taxonomic status, since it refers neither to a species or subspecies nor to an archaeological phase or culture. The name is not commonly encountered in modern professional literature in English, since authors prefer to talk more generally of anatomically modern humans (AMH). They thus avoid a certain ambiguity in the label “Cro-Magnon”, which is sometimes used to refer to all early moderns in Europe (as opposed to the preceding Neanderthals), and sometimes to refer to a specific human group that can be distinguished from other Upper Paleolithic humans in the region. Nevertheless, the term “Cro-Magnon” is still very commonly used in popular texts because it makes an obvious distinction with the Neanderthals, and also refers directly to people rather than to the complicated succession of archaeological phases that make up the Upper Paleolithic. This evident practical value has prevented archaeologists and human paleontologists – especially in continental Europe – from dispensing entirely with the idea of Cro-Magnons.”

Taken from The Oxford Companion to Archaeology. Oxford, UK: Oxford University Press. p. 864.

5

“Jambo, Jersey Zoos world famous and much loved silverback gorilla had a truly remarkable life. He was born in Basel Zoo in Switzerland in 1961. He arrived at Jersey Zoo on the 27th April 1972. Jambo, Swahili for Hello, is perhaps better known to the public for the gentleness he displayed towards the little boy who fell into the gorilla enclosure at Jersey Zoo one afternoon in 1986. The dramatic event hit the headlines and helped dispel the myth of gorillas as fearsome and ferocious. It was a busy Sunday afternoon in August 1986 when an incredulous public witnessed Levan Merritt a small boy from Luton UK fall into the Gorilla enclosure at Jersey Zoo. “

Extract taken from “The Hero Jambo”, a tribute to Jambo written by the founder of Jersey Zoo, Gerald Durrell.

6

“LAST SUMMER, AN APE SAVED a three-year-old boy. The child, who had fallen 20 feet into the primate exhibit at Chicago’s Brookfield Zoo, was scooped up and carried to safety by Binti Jua, an eight-year-old western lowland female gorilla. The gorilla sat down on a log in a stream, cradling the boy in her lap and patting his back, and then carried him to one of the exhibit doorways before laying him down and continuing on her way.”

Extract taken from an article by F. B. M. de Waal (1997) entitled “Are we in anthropodenial?”, Discover 18 (7): 50-53.

7   

“Binti became a celebrity overnight, figuring in the speeches of leading politicians who held her up as an example of much-needed compassion. Some scientists were less lyrical, however. They cautioned that Binti’s motives might have been less noble than they appeared, pointing out that this gorilla had been raised by people and had been taught parental skills with a stuffed animal. The whole affair might have been one of a confused maternal instinct, they claimed.”

Ibid.

8 Quoted in an article entitled: “Confessions of a Lonely Atheist: At a time when religion pervades every aspect of public life, there’s something to be said for a revival of pagan peevishness”, written by Natalie Angier for The New York Times Magazine, from January 14, 2001.

9 In 1907, MacDougall weighed six patients who were in the process of dying (accounts of MacDougall’s experiments were published in the New York Times and the medical journal American Medicine). He used the results of his experiment to support the hypothesis that the soul had mass (21 grams to be precise), and that as the soul departed the body, so did its mass. He also measured fifteen dogs under similar conditions and reported the results as “uniformly negative”. He thus concluded that dogs did not have souls. MacDougall’s complaints about not being able to find dogs dying of natural causes have led at least one author to conjecture that he was in fact poisoning dogs to conduct these experiments.

10 Extract taken from Chapter 2, “Thinking Machines” of Steven Pinker’s How the Mind Works, published by Penguin Science, 1997, p 148. Italics in the original.

11 An operant conditioning chamber (sometimes known as a Skinner box) is a laboratory apparatus developed by BF Skinner, founding father of “Radical Behaviourism”, during his time as a graduate student at Harvard University. It is used to study animal behaviour and investigate the effects of psychological conditioning using programmes of punishment and reward.

12 Extract taken from Notes on the Way by George Orwell, first published in Time and Tide, London, 1940.

13  I received a very long and frank objection to this paragraph from one of my friends when they read through a draft version, which I think is worth including here by way of balance:

“I must explain that I’m a hedonist to a ridiculous degree, so much so that my “eudaemonism” (sounds dreadful –not like happiness-seeking at all!) is almost completely bound up with the pursuit of pleasure, as for me there is little difference between a life full of pleasures and a happy life.  Mind you, pleasure in my definition (as in most people’s, I guess) covers a wide array of things: from the gluttonous through to the sensuous, the aesthetic, the intellectual and even the spiritual; and I would also say that true pleasure is not a greedy piling up of things that please, but a judicious and even artistic selection of the very best, the most refined and the least likely to cause pain as a side effect  (I think this approach to pleasure is called “Epicureanism”).

Love, of course, is the biggest source of pleasure for most, and quite remarkably, it’s not only the receiving but the giving of it that makes one truly happy, even when some pain or sacrifice is involved.  This is how I explain acts of generosity like the one you describe, by the woman who helped you when you fell off your bike as a teenager: I think she must have done it because, despite the bother and the hassle of the moment, deep down it made her happy to help a fellow human being. We have all felt this way at some point or other, and as a result I believe that pleasure is not antithetical to morality, because in fact we can enjoy being kind and it makes us unhappy to see suffering around us. This doesn’t mean that we always act accordingly, and we certainly have the opposite tendency, too: there is a streak of cruelty in every human that means under some circumstances, we’ll enjoy hurting even those we love. But my point is, hedonism and a concern for others are not incompatible. The evolutionary reason for this must be that we are a social animal, so empathy is conducive to our survival as much as aggression and competitiveness may be in some environments. In our present environment, i.e. a crowded planet where survival doesn’t depend on killing lions but on getting on with each other, empathy should be promoted as the more useful of the two impulses. This isn’t going to happen, of course, but in my opinion empathy is the one more likely to make us happy in the long run.

Having attempted to clean up the name of pleasure a bit, I’ll try to address your other complaints against a life based on such principles: “Yet pleasure is more often short-lived, whilst happiness too is hard to maintain.” I agree, and this is indeed the Achilles heel of my position: I’m the most hypochondriac and anxiety-prone person I know, because as a pleasure-a-holic and happiness junkie I dread losing the things I enjoy most. The idea of ever losing [my partner], for example, is enough to give me nightmares, and I’m constantly terrified of illness as it might stop me having my fun. Death is the biggest bogie. I’m not blessed with a belief in the afterlife, or even in the cosmic harmony of all things. This is [my partner]’s belief as far as I can tell, and I’d like to share it, but I’ve always been an irrational atheist – I haven’t arrived at atheism after careful thinking, but quite the opposite, I’ve always been an atheist because I can’t feel the godliness of things, so it is more of a gut reaction with me. The closest thing to the divine for me is in beauty, the beauty of nature and art, but whether Beauty is Truth, I really don’t know, and in any case beauty, however cosmic, won’t make me immortal in any personal or individual sense. I’m horrified at the idea of ceasing to exist, and almost as much at the almost certain prospect of suffering while in the process of dying. This extreme fear is probably the consequence of my hedonist-epicurean-eudaemonism.

On the other hand, since everyone, including the most religious and ascetic people, is to some extent afraid of dying, is it really such a big disadvantage to base one’s life on the pursuit of pleasure and happiness? I guess not, although I must admit that I’d quite like to have faith in the Beyond. I suppose that I do have some of the agnostic’s openness to the mystery of the universe – as there are so many things that we don’t understand, and perhaps we aren’t even equipped to ever understand, it’s very possible that life and death have a meaning that escapes us. This is not enough to get rid of my fears, but it is a consolation at times.

Finally, I also disagree with you when you say that pleasure and happiness “are not, as we are accustomed to imagine, objects to be sought after at all. If we chase either one then it is perfectly likely that it will recede ever further from our reach.” There’s truth in this, but I think it’s also true that unless one turns these things into a priority, it is very difficult to ever achieve them. I for one find that more and more, many circumstances in my life conspire to stop me having any fun: there are painful duties to perform, ailments to cope with, bad news on a daily basis and many other kinds of difficulties, so if I didn’t insist on being happy at least a little every day, I’d soon forget how to do it. I’m rather militant about it, in fact. I’m always treating myself in some way, though to be fair to myself, a coffee and a croissant can be enough to reconcile me to a bad day at work, for example, so I’m not really very demanding. But a treat of some sort there has to be to keep me going. Otherwise, I don’t see the point.”

14  Kurt Vonnegut had originally trained to be a scientist, but said he wasn’t good enough. His older brother Bernard trained as a chemist and is credited with the discovery that silver iodide could be used to force precipitation through “cloud seeding”. If you ask for Vonnegut in a library, you’ll probably be directed toward the Science Fiction section, since many of his books are set in strangely twisted future worlds. However, his most famous and most widely acclaimed work draws on his experiences during the Second World War, and in particular on the Allied fire-bombing of Dresden. Vonnegut personally survived the attack by virtue of being held as a prisoner of war in an underground meat locker, and the irony of this forms the title of the novel, Slaughterhouse-Five.

15  Extract taken from “Why My Dog Is Not a Humanist” by Kurt Vonnegut, published in Humanist, Nov 92, Vol. 52:6.5-6.

16 “We cannot doubt existence without existing while we doubt…” So begins Descartes’ seventh proposition from his 76 “Principles of Human Knowledge”, which forms Part 1 of Principia philosophiae (Principles of Philosophy), published in Latin in 1644 and reprinted in French in 1647 – ten years after his groundbreaking treatise Discourse on the Method, in which “Je pense, donc je suis” (“I think, therefore I am”) had first appeared.

http://www.gutenberg.org/cache/epub/4391/pg4391.html

17 A more poetic version of Descartes’ proof had already been constructed centuries earlier by the early Islamic scholar Avicenna, who proposed a rather beautiful thought experiment in which we imagine ourselves falling or else suspended, and thus isolated and devoid of all sensory input, including any sense of our own body. The “floating man”, Avicenna says, in spite of the complete absence of any perceptions of a world beyond, would nevertheless possess self-awareness. That he can still say “I am” proves that he is self-aware and that the soul exists. In consequence, Avicenna also places soul above matter, although no priority is granted to reason above our other forms of cognition.

18  Further extracts from Freeman Dyson’s acceptance speech for the award of the Templeton Prize, delivered on May 16, 2000 at the Washington National Cathedral.

19  Prospero in Shakespeare’s The Tempest, Act IV, Scene 1.

20 In 1996, the New York Times reported that: “Dennis C. Vacco, the Attorney General of New York, ordered the company to pull ads that said Roundup was ‘safer than table salt’ and ‘practically nontoxic’ to mammals, birds and fish. The company withdrew the spots, but also said that the phrase in question was permissible under E.P.A. guidelines.”

Extract taken from wikipedia with original reference retained. http://en.wikipedia.org/wiki/Monsanto#False_advertising

21 For further arguments against “Terminator Technology”, I recommend the following website: http://www.banterminator.org/content/view/full/233

22 From Freeman Dyson’s acceptance speech for the award of the Templeton Prize, delivered on May 16, 2000 at the Washington National Cathedral.

23  From an article entitled “Virgin births discovered in wild snakes” written by Jeremy Coles, published by BBC nature on September 12, 2012. http://www.bbc.co.uk/nature/19555550

24  Also from Freeman Dyson’s acceptance speech for the award of the Templeton Prize.

25 http://www.greenpeace.org/seasia/ph/press/releases/GMOs-declared-unsafe-in-India-Greenpeace-calls-on-PH-to-follow-suit/

This original link has since been removed but the same article can be read here:

https://web.archive.org/web/20130607155209/http://www.greenpeace.org/seasia/ph/press/releases/GMOs-declared-unsafe-in-India-Greenpeace-calls-on-PH-to-follow-suit/
