Physics - Upper Echelon of the Scientific Priesthood

From the days of Bacon, the physicist has occupied the apex of the scientific hierarchy because physics is, by definition, the purest form of logical inquiry into the nature of the material world. Reductionism, the doctrine that complex phenomena are explainable in terms of simpler ones, went hand in hand with the idea that other fields of scientific inquiry were subsidiary to physics. An example of this deference to physics in the nineteenth century can be seen in geologists' abandonment of their estimate of the age of the earth in the face of objections by Lord Kelvin, based on his thermodynamic calculations. Since no one could undertake to refute his logic or mathematics, they fixed the age of the Earth at a mere 100 million years, despite their own data, which suggested the earth was much older. Today the Earth is generally held to be about 4.5 billion years old, some forty-five times that figure (Lindley 28-31).

Occupying as they do the upper echelons of the scientific hierarchy, it is natural and usual for physicists to speculate on the ultimates of nature, and as Soyinka has noted, there is nothing more ultimate than the origin of the universe. In our science-oriented society, "cosmology" is a branch of physics, and cosmologists like Stephen Hawking are held in the highest esteem, much as Einstein was. The most widely accepted cosmological model today is the "Big Bang," the idea that the universe came into being in a catastrophic explosion, an explanation that on its face incorporates the Judeo-Christian myth of creation. Despite its male sexual imagery, a quick glance at this theory reveals some familiar conundrums in slightly different forms. What came before the Big Bang (i.e., what existed in the absence of all that we know)? Who, or what, caused it? The theory postulates that all the laws of physics were established in the split seconds after this event. If so, why were they established in the forms they were? Was the present universe, and only the present universe, potential in the pre-Big Bang state? Most importantly, was human life potential in it? One could say that the fate of the theory rests on such philosophical and theological questions.

One of the obvious reasons why the scientific priesthood enjoys high status is that, in the words of Davies, "science works." In the face of this status, it is easy to forget that theories about the origin of the universe are by their very nature speculative. Like all scientific theories, they are affected by the Zeitgeist, or spirit of the times, both contributing to and drawing from the prevailing cultural climate. David Lindley, an astrophysicist and Senior Editor of Science, observes that cosmological theories are tinged with the religious and aesthetic judgments of those who formulate them:

"In cosmology, as in particle physics, experimental or observational evidence is becoming scarce, and theory is increasingly removed from what can be measured or detected; cosmology may come to be ruled more by aesthetic preference or prejudice than by the traditional principles of science." (131)
Let us explicate the speculative nature of theories about the origin of the universe. To begin with, let us note that it is impossible to be sure what happened on this planet even six thousand years ago. This period in the Earth's history is forever lost to the world of sense experience and so retains an element of the mysterious. We also observe that the records of human development on this planet (what we call "history") are constantly subject to interpretation and revision, and historians agree that their discipline involves as much fiction (in the form of imagination) as fact. Uncertainty attends even adult individuals trying to recall the events of their own childhoods. Would not the same principle apply to a theory concerning the origin of the universe, only a tiny fraction of which is available to our sense experience in its present form? It must be impossible to know anything for sure about its history, let alone its origin. And yet the spirit of scientific and philosophical inquiry would not be satisfied if the universe were accepted as given without an attempt to know more about it, and this includes asking about its origin.

A Key Assumption
The last statement about the history of the universe needs some clarification. The empirical data we have about the sidereal world arrive in the form of light, which has a finite velocity. Therefore, when we look at distant objects, we see them as they existed in the past. The further away the object, the further into the past we are looking. In effect, what we are observing with our telescopes is the history of the universe. There is a corollary to this, however: assuming the universe incorporates the extension of space we experience on this planet, we have no idea what a region of space a billion light years away will look like a billion years from now, or, to put it another way, what a region of space a billion light years away looks like to a local observer - in effect, what it would look like to us if we were there now. Suppose there were "someone" fifteen billion light years away looking in the direction of our solar system with a powerful detecting device. They would not see our earth, because (according to the estimates of geophysicists) the earth had not yet formed! Such a hypothetical observer would detect no intelligence, nor see any dinosaurs or even evidence of microscopic life - in fact, no sign of life at all, nor the oceans, nor even the molten fireball that eventually cooled to become the earth. In the same way, the distant objects we observe may not even exist in what we call the present. The "present" universe may be populated with planets identical to ours which we haven't seen yet, or perhaps the Andromeda galaxy has already exploded in a huge catastrophe that we have yet to learn about. In other words, our observations of the history of the universe tell us nothing definite about the universe as it would appear to our senses were we able to perceive all of it in our "present."

The assumption that the universe is a unified whole and that all parts of it develop in the same way is just that, an assumption, albeit a powerful and desirable one. It allows us to deduce the present state of the universe, of which we cannot possibly have any direct experience, from what amounts to the observation of past records, even though we can do no such thing on a planetary, species or individual level. Further, on the strength of this assumption, astrophysicists have extrapolated their data on the "present" universe to a distant past that, let us face it, is manifestly hypothetical.

Completeness of inquiry demands that we proceed, even speculatively, so let us make this assumption. Let us also assume, with creationists, that the universe originated in a cosmic catastrophe at some finite time in the past. Allowing that time is a measure of the unfolding of the universe, this point would constitute a "singularity," a paradoxical case of existence before time. There is no conceptual problem with measuring time from this point, but as to what this point actually is, we are confronted again with the Aquinas conundrum. Nevertheless, let us assume its existence. This would have been a major show of fireworks, in fact, the largest conceivable. (Mythologist Alexander Eliot calls it "the mother of all shouts" (28).) Should there not be a cosmic record of it, that is, should we not be able to see it happening if we look far enough out into space (and back into the past)? Should we not then be able to demarcate visually, at least in principle, the limits of the observable universe? The beginning of the universe should not be hard to spot, no matter how far away it is. Astrophysics posits that in all observed regions of space there is an ambient background radiation with a temperature of about 3 degrees above absolute zero, and that this is the remnant of the Big Bang. If this is the case, however, should we not detect a temperature gradient as we look further out into space (and further into the past)? Should not the universe heat up as we look farther out?
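As it happens, the standard Big Bang account does predict exactly such a gradient. The relation is a textbook result (stated here for reference; it is not drawn from the sources cited in this essay): the background radiation seen at redshift z is hotter than the locally measured value by the same factor by which the universe has since expanded,

```latex
% Temperature of the cosmic background radiation at redshift z,
% where T_0 \approx 2.7\,\mathrm{K} is the locally measured value:
T(z) = T_0 \, (1 + z)
% e.g. at redshift z = 3 the background radiates at roughly 11 K.
```

so on this account the background does indeed "heat up" as we look farther out, a scaling that astronomers have since tested against the excitation of gas in distant clouds.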

God and The Big Bang
The "Big Bang" theory uses scientific language to provide what is essentially a mythological cosmology, not straying far from the creationist dogma of the Judeo-Christian society in which it arose, while at the same time providing a satisfying matrix within which the laws of physics as we know them can exist. This latter need would have been filled equally well, however, by a competing cosmological explanation, the Steady State theory of Hermann Bondi and Thomas Gold, which was popular in the nineteen fifties. This cosmological formulation postulated neither a beginning nor an end to the universe, while still allowing for the unidirectional passing of time which is an obvious feature of our universe. It required that small amounts of matter be spontaneously generated in all areas of the universe, keeping its average density constant as it expands. This "continuous creation" could be conceived in a way that did not violate the principle of conservation of energy, and when Fred Hoyle performed the appropriate mathematical calculations, he discovered that the result fit beautifully into the Bondi-Gold model.

Davies tells us that this model did not fail on philosophic grounds - there is much appeal, even in the Christian world, in a universe with no beginning and no end - but was falsified by observation. Curiously, the two observations he cites as laying this cosmological model to rest are the two just discussed. The Steady State theory predicted that the universe should be the same in all places and all epochs, but observations of deep space revealed the distant universe to be very different from our region. The second observation was that of the ambient radiation, which, as we noted, is believed to be the remnant of the Big Bang (57). I will admit to not understanding the Big Bang hypothesis fully, but in light of the above, I am tempted to ask whether, had these two competing cosmologies arisen in a non-Judeo-Christian context, these observations would have laid the Steady State theory to rest so quickly.

As we noted, the issue of agency arises with regard to the Big Bang theory. One way of treating it has been proposed by Hawking and Hartle (Hawking 144). A Creator, or Creative Process, becomes an issue when it is thought that the universe came into being at a specific point in space-time. They argue, however, that there is a way of applying the Heisenberg Uncertainty Principle to the early universe, when it is imagined to be a super-dense, super-small particle, that abolishes this "singularity," the point in space-time at which the universe came into being. In this formulation, one can say there was a time when the universe did not exist, but it is impossible to establish this time. Thus they provide, through an application of quantum mechanical principles, creationism without a specific act of creation, and the problem of agency is neatly side-stepped (Davies 68).

Before becoming convinced that the Hawking-Hartle formulation dispenses with the need for a Creator, however, it is good to remember, as Sheldrake points out (by quoting Hawking), that it is based on the reductionist assumption that everything can be explained by the physics of the smallest particles (126). Reductionism is something of a credo among physicists, as we have noted, allowing them to visualize the universe as composed of hierarchies of matter. In this way, the complex processes of life may be seen as reducing to the interactions of sub-atomic particles. Reductionism was the rationale for believing that all the processes of nature, including life, could be understood in terms of physical laws, from which arose the mechanistic conception of the universe that was the Zeitgeist of the nineteenth century. LeShan and Margenau have pointed out, by way of discrediting the idea of reductionism, that higher "levels" of complexity incorporate observables that are meaningless at a less complex level. They speak instead of "transcendence with continuity" (Ch. 7 & 8), a concept that lends itself well to describing the progression of matter to higher and higher levels of complexity that is a matter of empirical observation in the living world around us. It is possible to formulate evolutionary cosmologies that reduce upwards, characteristics of lower levels being explainable in terms of their efforts to synthesize into ever higher levels of organization. This is the idea behind Teilhard de Chardin's conception of "Omega," the Ultimate Locus of Attraction (Hymn 9). For Teilhard, evolution is a progression of complexity along a preferred axis, the outcome of which is pre-determined, to some extent, by the nature of Omega. The force of attraction is love (Phenomenon 264). Teilhard's is another mythology from the Judeo-Christian tradition whose existence is just as desirable to science (or should be) as that of the "Big Bang." It is needed to explain the existence and growth of life in a universe which, according to the explanations of physics, seeks nothing but thermodynamic equilibrium.

Hawking and Hartle have not dispensed with the question of agency, and Teilhard has shown the desirability of retaining and even expounding upon God in scientific circles, even if God does, allegorically, need to "wear new clothes" in this new context.

The Paradigm of Objective Observation - The Ghost in the Machine
Reductionism is a deterministic viewpoint which supposes the world to be analyzable and predictable in terms of simple, explainable phenomena. The reductionist attitude which prevailed in science from the time of Descartes until the adoption of quantum theory was definitively expressed by Pierre-Simon Laplace. He posited a demon who knew all the forces of nature, as well as all the data concerning some momentary state of the universe. Such a demon, said Laplace, would be able to calculate any other state of the universe at any other given moment. Laplace's demon, says Sheldrake, was nothing other than a superhuman scientist (90) or, in modern terms, an ordinary scientist in possession of a supercomputer. This demon was the quintessential "objective observer."

The dualistic doctrine of the objective observer, the "Cartesian partition," as Heisenberg calls it, resulted in conceptions of the reasoning mind as localized somewhere within the body - the "ghost in the machine." Descartes himself visualized it as a little man working the complex machinery of the body. More modern conceptions place this seat of consciousness somewhere in the brain. A few decades ago, this little man was visualized as "a telephone operator in the telephone exchange of the brain, and he saw projected images of the external world as if he were in a cinema" (Sheldrake 51). This looks a little comical today, but a version of "the ghost in the machine" continues to exist in the conception of the "seat of consciousness" supposed to reside in the region of the pineal gland. Many people speak of their "awareness" or "consciousness" as if it were a disembodied version of themselves able to travel through space "without the body." Modern research on brain-impaired individuals seems to militate against strict localization of consciousness in the brain, however. In one instance related by Talbot, an individual with only a "minute rim of brain tissue" functioned in an apparently normal way, the defect being discovered only upon his death (87).

This discussion becomes more relevant to most people when they realize that the same idea persists in their conceptions of the soul, or of life after death. The conception of the soul as a non-material self with only temporary connections to the body has a long and complicated history within many cultural contexts, and one formulation of it is undoubtedly the observer in the Cartesian partition. Macy has said that had it been the process view of Heraclitus (panta rhei - everything flows) which took hold of later thinkers, rather than the causal view of Parmenides (ex nihilo nihil fit - nothing comes from nothing), the whole basis of western philosophy would have been different (10) - which is, of course, itself an example of causal thinking. Be this as it may, the process view, after a two-thousand-year dormancy, has come into its own with the advent of quantum theory, and we can even fix the year when this occurred.

The Advent of Indeterminism
The year was 1927, when Heisenberg introduced the Uncertainty Principle. This principle arose as the quantum picture of matter was being worked out. According to one interpretation, at the microscopic level, where the equivalence of mass and energy is most important, matter/energy is vibrating, and anything which vibrates has a frequency and wavelength associated with it. If we want to determine the wavelength of an individual quantum, we must perform a measurement, which involves the introduction of external energy in the form of light. Hence, our measurement will record the energy not of the quantum, but of the quantum in interaction with the introduced light. The Uncertainty Principle places a limit on what we can know about the world in itself, at least as far as microscopic processes are concerned. It states, in effect, that the information extracted from an observation depends on the act of observation. This does not do away with dualism entirely, because an agent is still needed to interact with the world in order to extract information, but it does render obsolete the Cartesian "disembodied observer" of classical science. Heisenberg himself said:
"Natural science does not simply describe and explain nature; it is a part of the interplay between nature and ourselves; it describes nature as exposed to our method of questioning. This was a possibility of which Descartes could not have thought, but it makes the sharp separation between the world and the I impossible" (69).
The quantum universe is a process universe, in which there are no observers, only "participants," to use the terminology of physicist John Wheeler. In such a universe, the causal principle is replaced by one of interdependency. Buddhist philosophy has a term for this principle: paticca samuppada, a Pali term which has been translated as "dependent co-arising" (Macy 18). The immediate post-Hiroshima world, into which the "baby boomers" (myself included) were born, was a world in which these principles were becoming a matter of experience.
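The limit Heisenberg identified can be stated exactly, in its standard textbook form (given here for reference; it is not quoted from the sources cited in this essay):

```latex
% Heisenberg's uncertainty relation: the product of the uncertainties
% in a particle's position (\Delta x) and momentum (\Delta p) can never
% fall below a fixed bound set by the reduced Planck constant,
% \hbar \approx 1.05 \times 10^{-34}\,\mathrm{J\,s}:
\Delta x \, \Delta p \; \geq \; \frac{\hbar}{2}
```

Because the constant on the right is so minute, the limit is invisible at everyday scales; it bites only in the microscopic realm the paragraph above describes, which is why classical physics could sustain the fiction of the disembodied observer for so long.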

"Quantum Participation" in the Post-Hiroshima World
At the time of Einstein's death in 1955, many intellectuals were well on the way to developing a post-Newtonian vision, but they still had their feet firmly planted on classical soil. Though their theories increasingly suggested the existence of a quantum universe of pure energy which did not exist independently of their observations, they still maintained the personae of disembodied observers. Leary, a neuro-psychologist, declared in the mid-seventies that it takes something more persuasive than vision to change a world view. It is necessary to rewire neural imprints:

"The only way to rewire neural patterns is to interfere with the neurotransmitter sequence at the synapse, thus retracting the old imprint and allowing for a new imprinting. Shock, illness, trauma, drugs, child delivery, stimulus deprivation and electrical charge are the only ways to change the chemistry of the synapse" (Info-Psychology 45).
In the late fifties/early sixties, he was a member of Harvard's psychology department, and this understanding may have been, at that time, nascent. It was demonstrated vividly, in his case, by subsequent events which we shall consider presently.

Widespread experimentation with neural imprints is a post-Hiroshima activity. The relevant theoretical framework was probably conceived about the same time, and in the same vein, as quantum theory, relativity, cybernetics and other far-reaching scientific ideas. For instance, we can find presaging echoes in the teachings of G. I. Gurdjieff, the pre-Hiroshima mystic teacher who died in 1949, who re-defined the work of seventeenth-century Rosicrucians like Dr. Robert Fludd and Jakob Böhme with his conception of the Enneagram, a nine-pointed diagram meant to depict the interplay of natural processes necessary for stability. Gurdjieff's improvement included a recognition of the need for "intervals and shocks" (Bennett 47), a precursor of post-Hiroshima neurologics and the probable inspiration for the "engrams" of Scientology's L. Ron Hubbard. We note the allusions to quanta (intervals) and electricity (shocks).

The metaphor par excellence for the relativistic/quantum shift in scientific thinking is the atom bomb, and when it became a world reality, living manifestations of the Cartesian observer could be seen sitting at control panels in bunkers at Los Alamos or in the Nevada desert - anachronisms watching explosions from retreating bombers or plotting tactical nuclear strikes in the Pentagon as if they were "above it all," that is, unaffected by the destruction they were wreaking. They seemed to have the power of gods, able to unleash fury at will. It was a clear case of classical thinking in a non-classical context. They were observers of, not participants in, the fury they were unleashing. The now old-fashioned scientific paradigm of "the ghost in the machine" became a military one (taking on a morbidly gruesome connotation), and it affected the entire United States during the period immediately following World War II. This was, coincidentally, the era of the mass production of the television, that "cool" (in the McLuhanesque sense) medium that quickly supplanted the "hot" media of radio, newspapers and newsreels as the preferred information source of the nation's households. The impact of television on American culture was considerable.

Television, the Participatory Medium
A cool medium is one which invites interaction with the neural circuitry and is essentially participatory, while a hot medium is not. For the first ten years of mass TV broadcasting, the paradigm of authoritarian distance that prevailed in the hot media continued to dominate the airwaves. Information and "news," often colored by jingoistic propaganda, passed unhindered through the retinae into the cerebral cortices of viewers. If it left an unpleasant taste, this could be tempered by information about, or even a visit from, a Hollywood "star" (the celestial analogy a clear indication of how wide the gap was between media producers and consumers). By far the greatest stars were the scientists who were making this technological revolution possible. They were beyond even television appearances, and TV had to make do with science "editors."

As early as 1950, however, some pioneers were discovering the interactive capabilities of television. One was the legendary Groucho Marx, himself a Hollywood icon, whose TV show "You Bet Your Life," one of the first to be successfully translated from radio, was a prototype for the game-cum-variety show, on which it was possible for "common, everyday" people to commune with the stars. They often encountered subtle hostility in the form of jocular but insulting repartee, but as long as they were good sports about it, they were almost always rewarded, like laboratory rats, with merchandise or money. This displayed a certain arrogance in the media establishment that was evident, moreover, in the way television was used to foist consumer products on the public, products advertised, more often than not, as "scientifically tested."

Objective, scientific distance in the classical sense was simply inappropriate to television, which functions like an extension of the visual/auditory centers of the brain, placing the viewer in mind, if not in body, in the midst of the action depicted on the screen. Moreover, by the repetitive superimposing of image upon image, as in commercials, it can teach, or imprint, at a subliminal level far more effectively than print or radio, simply because more attention is required to watch it. Television images are a non-linguistic language that, though often untranslatable into linear thought, communicates information with great efficiency. Like all electronic media, its potential as a tool for neural imprinting is great, and because of the centralization of TV stations in the fifties, it was undoubtedly exploited as such, though probably without conscious intent. The same could be said of motion pictures, of course, but the key difference is the ongoing "live" nature of television. With television, the whole of North America became wired into a common neuro-circuit, and people of different religious and cultural backgrounds shared the same TV images. This circuit soon expanded to include Europe and eventually, through satellite, the whole world. This process, aided by advances in telephone technology, has made it possible not only to know about but also to interact with events worldwide. For instance, one can, at least in theory and increasingly in fact, watch a live report or television show, dial a number on the telephone and actually, in a small or large way, affect the outcome of events.

We live today in what McLuhan called the "global village." With television, the world has become increasingly participatory, but this capability of television was not fully realized, or even suspected, in its early days. Beyond this, however, was the fact that the full democratic participation in government that television made possible was actively resisted by authoritarian elements. This last statement deserves elaboration, because it is important to the historical chronologies that follow.

Scientio-Political Authoritarianism
We have described the origins of the scientific priesthood, that male-dominated society that has overseen the development of western scientific thought since the time of Bacon, and we have alluded to the fact that it was incorporated into the military establishment either before or during World War II. The inner machinations of religious hierarchies are quite often secret, shrouded in religio-mystical language in order to maintain their power. Once one deciphers or learns the language - the chief preoccupation of the initiate - one has political access to such organizations. Many baby boomers, myself included, impressed by scientific ideas, influenced by the North American fascination with science, and just plain inquisitive, became initiates.

One could say analogically that the scientific hierarchy hid itself in the "ivory tower" of the scientio-mystical language of mathematics, and we have maintained that the scientific mystique was borrowed by the military-industrial complex. Some of the first post-Hiroshima generation, which was in general highly educated, began to rebel against the authoritarian distance of what was in effect a government dominated by the military. They started demanding more immediate participation. From the point of view of those running the government, it became necessary to implement more extreme means of maintaining control in order to prevent a perceived proliferation of anarchical influences from eroding their power base. This resulted in a tightening of control over government activities deemed vital to "national security." The National Security Council, the CIA, and the other covert operations which bloomed within the United States government were antithetical to the concept of participatory democracy and gained power partly out of a religio-mystical authoritarianism that was a throwback to the pre-participatory era before television. The fact that Freemasons founded the U.S. government and maintain it on several levels may be another possible cause of this secretive attitude (this is discussed in more detail below), but secrecy was not unique to the American government. The government of the Soviet Union, supremely authoritarian and deeply rooted in the Russian Orthodox Christian tradition (which it heavily suppressed), by all accounts far surpassed the Americans. The culmination of this trend, its polar apogee, was the U.S. administration of 1981-89, headed by a former Hollywood "star," hugely popular, who sanctioned a return to fundamentalist Christian teachings and who publicly denounced the Soviet Union as an "evil empire." This administration was widely accused of carrying on covert operations that violated the constitution, but it was also the administration during which the Cold War ended.
