On-line version ISSN 2412-4265
Studia Hist. Ecc. vol. 39 no. 1, Pretoria, May 2013
Cornel W du Toit
Research Institute for Theology and Religion, University of South Africa, Pretoria, South Africa
The nascent theory of emergence is a plausible model not only for the course of natural and biological processes, but also for developments at an interpersonal and social level. In order to apply it to theology, I propose a non-causal approach to the discipline. In this article non-causal presupposes a non-linear, non-deterministic causality. Brief excerpts from the classical view of causality highlight the problems it entails. The quantification of reality following the rise of statistical science introduced all the elements that were to feature in the eventual theory of emergence: chance, probability, chaos, multiplicity (which nonetheless translated into regularity), and the notion of normativity associated with the mean and the dispersion of variables around it. The control principle is criticised, and preference is given to the concepts of freedom and spontaneity. The article concludes with some applications of a non-causal theology.
Introduction: important factors in thought about causality
"Where do things come from?" "Who made everything?" "Why are things what they are and not something else?" This type of questioning appears to be part of being human. The search for causality, therefore, seems intrinsically human. We know intuitively that for everything there is a reason and that events are effects and that these effects therefore must have causes - nothing happens for no reason (ex nihilo nihil fit). At this level, people tend to attribute different causes to events: when a group of soccer players is struck by lightning on the field, it is ascribed (depending on where it happened) either to natural causes (the prevalence of lightning) or to an act of God, or maybe to supernatural forces, magic, sorcery. If lightning happens to strike an enemy who is hot on your heels, it will probably be regarded as a miracle.
For aeons people's worldview was governed by a reductive notion of causality. In a geocentric worldview, which assigns humans special status in the universe, it follows that everything that happens is directed to humans. Humans are seen as the acme of creation and, as the image of God, take pride of place in the creaturely realm.
This contrasts with the new cosmology and physics, in which humans are paltry beings in a vast universe, a random outcome of biological processes. Events are not planned for a purpose, are not designed to benefit or harm human beings: they just happen. Causal patterns that people discern and interpret as directed to their weal or woe are projections, arbitrary reconstructions of events.
Naturally, in such causal thinking, humans and their life world serve as a norm for causal factors. These intuitions make life predictable, hence manageable. If existence was not governed by laws, there would be no way of learning how to handle it, for each new day would be different and past experience would be worthless. Reality is seen as orderly, and humans have adapted to that order. This adaptation enabled them not only to survive, but to flourish. Yet, a cosmic order does not imply total predictability: from the outset humans were aware that some things might or might not happen. God was connected with order. Laws underlying reality were posited and ascribed to God's will. These laws ruled the cosmos, hence God was slotted in as the all-determining cause of events. Thus causality was linked with determinism.
Many other (intuitive) assumptions are products of the concept of causality, as are our (erroneous) answers to questions about it. If two things consistently happen in succession, we assume a causal link between them. Cause and effect operate on the principle of post hoc, ergo propter hoc: if one event follows another, the second is taken to be the effect of the first. Yet we do not see the causal factor: we simply assume it and attribute it to some agency, be it human, natural or supernatural. Regularity is attributed to causality. Cause and effect are confined to a specific space and time; cause always precedes effect; and there is a consistent connection between the two. The philosopher David Hume described most of these arguments and the critical objections to them (in Du Toit 2006).
To humans it is counter-intuitive that things could happen without a cause, i.e. that God (the First Cause, from a philosophical point of view) does not control all that happens; in other words, that things happen by chance irrespective of our place in the order of creation; that my lot depends largely on sheer chance; that life or any other significant development can emerge spontaneously. By contrast faith is a closed system in which human life is infinitely significant and humans are secure in the framework of the will of a loving God, in whose big book their names and destinies are indelibly written.
These typical features of causal thinking have largely disappeared as a result of the scientific revolution and subsequent developments, mainly in Europe, in the 18th century. The rise of statistics ushered in a numerically oriented worldview.
In the 19th century, the realisation dawned that the world, although marked by regularity, is not subject to universal rules. For the first time, the role of chance was taken into account. However, these new scientific interpretations of causality had no corresponding impact on theology. On the whole, religions relate(d) causality to divine intervention - an issue responsible for much of the antagonism in the science-religion debate. If God chooses, God can intervene and suspend the natural laws that God laid down. This is not provable, but is assumed in faith. Theologians criticised metaphysical systems, but disregarded scientific criticism of classical causal thinking. This was probably because the abolition of such thought would affect (metaphysical) God concepts too fundamentally, concepts holding that God is sovereign, omnipotent, responsible for all that transpires; and no event, good or bad, occurs without his foreknowledge or will.
This article briefly reviews the decline and transformation of causal thought, the philosophical concepts and epistemological models that replaced it, and the possibility of a non-causal theology. This is necessary because the insights that transformed causal thought have a decisive impact on present-day societies. Notions of the so-called "agency" in causal relations have changed greatly. So has the notion of linear causality, which assigns a single cause to consequences brought about by a far more complex network of factors. Instead of causality, the preferred explanatory model has become that of emergence.
Note that the concept of causality is not refuted, but a reductive kind of linear causality is rejected, because it does not allow for all the known and unknown factors that contribute to new turns of events. In addition to developments in the physical sciences, the rise of the science of statistics and evolutionary epistemology has revolutionised concepts of causality.1 A clearly defined, non-causal theology is proposed and some of its implications are considered. Such a theology would focus on events that are not predestined, hence predisposed, by a predetermined teleology.
At the dawn of human history, agency was largely personalised. Animism saw the whole world as en-souled: there were forest spirits, water spirits, spirits associated with all sorts of (sacred) places. Human artefacts (especially weapons) were regarded as animate. Gradually natural causes came to be distinguished from supernatural ones. In nature, agency was determined by natural causes and natural laws.2 What was humanly inexplicable was ascribed to supernatural forces (ancestral spirits, gods, demons, etc). To the ancient Greeks Ananke (rendered as power, constraint, force, necessity)3 personified fate. The gods, including gods of fate, had to be personal, since agency is associated with motives and motives cannot be impersonal. Hence, neutral forces, like natural forces, were also personified. However, fate (fatum) could also be impersonal. Aristotle's world soul was said to attribute a type of animation to all matter. Even after the emergence of science, the question about the animation of material things persisted. In Puritan England the Boyle-Hobbes controversy epitomised the grappling with the non-divine agency proposed by the physical sciences. Movement of molecular particles was an accepted principle. The big issue was whether they moved of their own accord or whether their every movement depended on God. The controversy was ground-breaking, since if particles could move by themselves, humans and institutions could also act autonomously without relying on God for every move (Du Toit 2007b:116ff). I shall not dwell on the developmental history of causality, but merely take a look at the Occasionalists - from a Christian angle they brought the issue of divine agency to a head.
The idea of creatio ex nihilo probably prompted the Occasionalists to argue that, if God could create everything from nothing, God could cause things without any need to demonstrate a logical or causal link between cause and effect (Watson 1993:76ff). Occasionalism exemplifies the difficulty of establishing a physical or scientific connection between cause and effect. Whereas, in our worldview, interaction between body and mind and between objects seems natural, to 17th century thinkers it was mystical. To the Occasionalists no creaturely substance can affect anything else. All events were attributed to God. To quote one of their chief proponents, La Forge: "... by 'occasionalism' I mean a doctrine according to which God is actively, constantly, and ubiquitously engaged causally in the world. For an event or state of affairs x ... there is a discrete volition in God, a volition whose content is specific and that is causally responsible for the occurrence of that event or state of affairs: 'Let x occur at T'" (Nadler 1993:59). Occasionalism influenced Descartes inasmuch as it involved the mind-body problem. According to him, the two entities were radically disparate; therefore they could not affect each other unless God intervened. To the Occasionalists this also applied to the influence that two extended substances can exercise on each other (Nadler 1993:72). Descartes considered interaction between extended substances to be mechanical, like the operation of cogs, whereas "minds cannot contact bodies mechanically, and so also bodies cannot press against minds to influence them" (Watson 1993:77).4
This is very evident in Malebranche's idea that the world of humans and things is just a puppet show in which objects behave as though they are interacting, whereas in fact they do not. All that can be claimed for causality is that B follows A. The cause as such cannot be seen. To postulate causal powers in things would be to revert to the Aristotelian world of spirits and occult powers (Watson 1993:84, 89).
Quantification of reality: an outcome of the scientific revolution
Science is the art of measurement: no measurement, no science. In an early article, "The function of measurement in modern physical science", TS Kuhn wrote the following: "The road from scientific law to scientific measurement can rarely be travelled in the reverse direction" (Bartholomew 2008:62). To discover quantitative regularity one must know what kind of regularity one is looking for and devise appropriate instruments to measure it.
But the quantification of the world was not an exclusively scientific achievement. It was rooted in pragmatic socio-political considerations. One of its major sources was Prussia in the reign of Frederick William I (1713-1740) and especially his son Frederick II (Frederick the Great, 1712-1786) and the public service reforms they initiated.
Since earliest times, population statistics were important, since one cannot prepare for war unless one knows how many able-bodied men one has at one's disposal. Especially after the Seven Years' War (1757-1763) under Frederick the Great, population statistics had to be updated. Virtually one third of the population had died in battle and there was a shortage of people to cultivate the land (Hacking 1990:22).
However, before long statistics were moralised. As early as 1710, John Arbuthnot claimed that divine providence regulated the balance between the sexes by ordaining the birth of more male infants to replace the men who were killed in war. Süssmilch, in turn, indicated that high mortality rates in cities were related to the sinfulness of urbanites (Hacking 1990:21).
There is a strong correlation between statistics and control. One cannot control (plan or prevent) anything unless one has accurate statistics. Wanting to count everything that can be counted obviously leads to absurdities, yet it is basic to most sciences. Between 1800 and 1850, quantification reached unprecedented heights. Hacking (2009:58-59, 62) lists the items that Babbage proposed for quantification (to the Royal Society, the Institute of France, and the Academy of Berlin). They included the constants of the solar system (distances of planets from the sun; revolution periods; forces of gravity exercised on the earth); the weight of atoms; metals (gravitation, elasticity, conductive properties, etc.); optics; numbers of all known species (including fossils); the height of all mammals plus their pulse beat, period of sucking, etc.; in regard to humans, their numbers, gender, lifespan, incidence of diseases among working classes; the strength of men and animals, hence work capacity - the quantity of (whatever kind of) work that can be performed per hour. And so it carried on, enumerating, measuring and computing the vegetable kingdom, the atmosphere, materials, geography, buildings (heights of temples, pyramids, churches, steeples, columns, etc.).
Most research was based on probability.5 What is the probability that patients suffering from the same disease will die from one kind of surgery as opposed to an alternative kind (Hacking 2009:85)? There was unprecedented interest in crime, in measuring the incidence of different crimes and demonstrating all sorts of correlations. The incidence and distribution of suicide in particular was in the spotlight (Hacking 2009:64ff). In 1785, Condorcet applied probability theory to jury verdicts. Arago demonstrated that, in a jury decision with a majority of seven to four, the probability of an error is one in four; when the majority is eight to four, one in eight will probably be wrong. Therefore, one out of every eight convicted criminals that are hanged is probably innocent. Consequently, Condorcet proposed "to proceed to moral mathematics, and compute the optimum jury system" (Hacking 1990:89).6 The point Hacking (1990:104) makes is that "moral science was replaced by moral analysis and then by quantitative sociology".
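The reasoning behind such jury calculations can be sketched with a simple textbook model: assume each juror votes correctly with some independent probability, and an even prior on guilt; the probability that the majority is wrong then depends only on the voting margin. This is an illustrative reconstruction under those stated assumptions, not Condorcet's or Arago's historical computation, and the function name and juror reliability are my own choices:

```python
def error_probability(guilty_votes, acquit_votes, p):
    """Posterior probability that a majority verdict is wrong, assuming
    each juror is independently correct with probability p and a 50/50
    prior on guilt. Under these assumptions the result depends only on
    the margin between the two vote counts."""
    q = 1 - p
    margin = guilty_votes - acquit_votes
    return q ** margin / (p ** margin + q ** margin)

# With moderately reliable jurors (p = 0.6), the chance of a wrongful
# verdict shrinks as the majority widens.
for g, a in [(7, 4), (8, 4), (9, 3)]:
    print(f"{g}-{a}: P(error) = {error_probability(g, a, 0.6):.3f}")
```

The striking feature, and the one that exercised the moral mathematicians, is that even a comfortable majority leaves a non-negligible probability of error.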
From chance to law and normality
In the course of the 19th century, advances in statistics led to growing and increasingly probing criticism of classical views of causality. The main target was the role of chance and probability.7 The past does not necessarily determine the present, at any rate not in the sense that it was formerly supposed.8
Vast multiplicity and complexity are usually associated with chaos. This is where randomness and chance enter into it. However, what appears to be chaotic at a micro level manifests order at macro level. Chaos in the sense of randomness apparently soon creates order: "... haphazard and seemingly unguided processes prove to have more pattern than we might have guessed." These patterns may emerge with statistical regularity (Bartholomew 2008:29), implying that order arises from chaos. "Chaos" represents the building blocks, a lower order on which the higher order of regularity builds.
What is fascinating is the startling novelty that comes out of randomness, which we would be incapable of inventing ourselves: "Randomness achieves easily that, which by design, might have been very difficult" (Bartholomew 2008:49, 53). The trouble arises when we try to link randomness with order or purpose. How can there be purpose at a higher level if it is based on lower levels of haphazardness? "Since randomness at one level implies order at another level, then, if one expects the order at this second level to express purpose, one cannot have it without randomness at the lower level" (Bartholomew 2008). The only answer is that, to a superior being, chaos is orderly and its outcome assured.9 However, that does not always apply, as in the case of radioactive decay, which is an example of pure chance. It is wholly unpredictable when the next emission from a radioactive source will occur. Hence Bartholomew (2008:218-219) asks the following: "... if God is the cause of everything, how can he cause something which, by definition, appears to have no cause?" Here, chance becomes a factor in its own right. Maybe we can see it as metaphysical chance characterising aspects of reality which are not associated with any form of causality. It must be distinguished from chance created by the nature and the operation of autopoietic systems, which need no antecedent cause to function. As the name indicates, they function by themselves.
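The point about radioactive decay can be illustrated with a small simulation: each individual "atom" decays by pure chance, with no antecedent cause at the micro level, yet the half-life of a large sample is remarkably stable. This is a hypothetical toy model, not a physical calculation; the per-step decay probability and sample size are arbitrary:

```python
import random

random.seed(1)  # fix the seed so repeated runs of this sketch match

DECAY_PROB = 0.05  # hypothetical chance that any atom decays in one time step

def half_life_steps(n_atoms):
    """Return the number of steps until half of n_atoms have decayed,
    each surviving atom decaying independently and by pure chance."""
    remaining, steps = n_atoms, 0
    while remaining > n_atoms // 2:
        # each surviving atom decays independently with the same probability
        decays = sum(1 for _ in range(remaining) if random.random() < DECAY_PROB)
        remaining -= decays
        steps += 1
    return steps

# No single decay is predictable, yet the half-life of a large sample
# is almost identical from run to run: order out of chance.
print([half_life_steps(10_000) for _ in range(5)])
```

The unpredictability at the micro level is genuine, but it washes out in the aggregate, which is precisely the pattern the statistical sciences seized upon.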
The role of chance in explaining things only became apparent as science progressed, especially the ability to measure infinitely large and infinitely small (quantum) worlds. Such measurements must be viewed in the very long time spans that could be involved. The measurement of minuscule and vast numbers gave rise to statistics (the science of calculation). It was realised that, if there are sufficiently many instances, they will reflect patterns - the law of large numbers, which has come to approximate the status of a metaphysical truth.
Statistical science has given rise to terms like probability. Concepts like chance, probability and the like did not materialise from thin air as useful, idealistic keys to reality. In 1844, Quetelet observed "that a great many human attributes have a graph, or distribution, just like that which had long been associated with coin tossing, and which had been elaborated for mathematicians as the 'curve of error'" (Hacking 1990:106). That marked the advent of the familiar bell-curve idea, which identifies the mean and the measure of dispersion around it.10
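Quetelet's coin-tossing analogy can be reproduced in a few lines: totals from repeated runs of coin tosses trace out the familiar bell shape, with a stable mean and a stable dispersion around it. A minimal sketch; the sample sizes and seed are illustrative choices of mine:

```python
import random
from collections import Counter

random.seed(0)

# Count heads in 100 coin tosses, repeated 20 000 times: the totals
# cluster around the mean in the bell shape ("curve of error") that
# Quetelet recognised in human attributes.
N_COINS, N_TRIALS = 100, 20_000
totals = [sum(random.random() < 0.5 for _ in range(N_COINS)) for _ in range(N_TRIALS)]

mean = sum(totals) / N_TRIALS
std = (sum((t - mean) ** 2 for t in totals) / N_TRIALS) ** 0.5
print(f"mean = {mean:.1f} (theory: 50.0), dispersion = {std:.1f} (theory: 5.0)")

# a crude text histogram of the dispersion around the mean
counts = Counter(totals)
for heads in range(40, 61, 2):
    print(f"{heads:3d} {'#' * (counts[heads] // 40)}")
```

Nothing forces any single trial toward 50 heads, yet the mean and the dispersion are as regular as any law, which is exactly what made the bell curve so seductive as a norm.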
As in many similar instances, the idea of even dispersion can be used as a backdoor to smuggle in the will of God, namely that it is God's will that the statistical distribution is what it is. To this Bartholomew (2008:120) rightly responds as follows: "But this is to get things back to front. The near constancy of numbers from quarter to quarter is a consequence of an underlying statistical process, not an externally imposed law." The implication is that divine agency lurks behind chance and uses it to achieve its goal after all. However, what point does that have for human beings if it remains hidden behind statistical regularity? Or is the will of the divine agency confined to the curve that represents a supernormal positive?
By now the bell-curve is taken for granted to the extent that its implications are often overlooked. The moment the mean is established it tacitly becomes the norm to which people conform. The mean is considered normal and serves as a criterion for deviancy.11 If I know the mean weight/height/intelligence/lifespan of people in my environment, any deviation from it tends to be considered abnormal, subnormal or supernormal. Quantification of the world establishes a new criterion of what is normal and inadvertently affects all of us.
We must not overlook the tremendous influence of Newton and the advent of a mechanical, law-governed universe. Although statistics could indicate the accepted mean - hence that which is considered normal - with apparent regularity, there was a great deal of underlying indeterminism, since it was impossible to establish in advance who fell within the norm (mean) and who were excluded. It was also realised that this was attributable to further (unknown) rules that determined the dispersion. That raised the question whether or not these unknown factors were deterministic (Hacking 1990:147).
The point is that the nature and interpretation of statistics had an unexpectedly great influence on people's self-image. It led to all kinds of social distinctions, including gender, class, race, nationality (migrants), income, marital status, educational level, intelligence, incidence of diseases, crime, and aspects of human behaviour. There is nothing in society that cannot be measured. If a given percentage of people are obese, suffer from certain diseases, are alcoholics, commit certain crimes and one falls in that group, it provides a measure of self-justification: it is "normal" for people to belong to these categories and one happens to be one of them. Hence, normality and deviancy are assessed through measurement; it determines what is normal or supernormal, and it inevitably influences people. "On the one hand there is the thought that the normal is what is right, so that talk of the normal is a splendid way of preserving or returning to the status quo" (Hacking 1990:168). This is the background to Foucault's criticism of what he calls biopolitics. The question is whether normality is built into the structure of the world and social orders. Bartholomew (2008:39) comments: "The ubiquity of the normal distribution arises not so much because it is built into the structure of the world but rather that it is imposed onto the world by the way we choose to observe it." However, that does not mean we can conveniently sweep all statistics under the carpet.
However, because this view was associated with regularity, it inevitably created the impression that we were back in the paradigm of determinism. "Normality" and determinism, like chance and necessity, are kindred concepts. Even in processes that we consider wholly determined things may happen that display all the attributes of chance (Bartholomew 2008:58).
CS Peirce repudiated all determinism and believed in absolute chance. We might observe regularity in nature, but that does not mean that nature is regulated or exact and uniform. He countered with his hypothesis of chance and spontaneity, showing how these crystallise and can be measured with mathematical exactitude (Hacking 1990:203). He combines evolving laws with evolutionary epistemology, as follows: "The 'constants' are only chance variables that have settled down in the course of the evolution of laws" (Hacking 1990:214).
If large numbers manifest patterns, we can predict fairly accurately what will happen in society, including how many crimes and suicides will be committed. We merely do not know which individuals will be involved, because we cannot compute the variables properly. Quetelet wrote in 1832: "It is society that prepares the crime; the guilty person is only the instrument who executes it. The victim on the scaffold is in a certain way the expiatory victim of society. 'His crime is the fruit of the circumstances in which he finds himself' " (Hacking 1990:114). This smacks of statistical fatalism. Obviously, nobody is forced to commit suicide, yet statistics show that many were not free to refrain from it (see quotation from Buckle in Hacking 1990:124). Hacking's response to the charge of determinism was prompt: "People are not fated to follow statistical law, because the conditions of application of the law can be changed ... Discover what are the statistical laws that govern crime, disease, vice, unrest. Then find ways to alter the conditions under which those laws apply" (Hacking 1990:118).
The doctrine of God was widely reconsidered in the light of the emphasis on chance and uncertainty. Bartholomew (2008:65-66) mentions some theologians' attraction to chaos theory (in which chance, uncertainty and indeterminism feature), since it enabled them to account for God's acts in our world without transgressing any laws. "For if such different paths are followed by processes starting from positions which differ infinitesimally, then perhaps God can engineer changes of direction at the human level by such minuscule adjustments that they are not humanly detectable." Hence the vast number of factors that we cannot detect creates an illusion of chance and indeterminism. It is something like Adam Smith's "invisible hand" that controls the economy. By the same token God's invisible hand can control a multitude of factors, leading to outcomes that conform to God's will. Unlike Adam Smith's invisible hand, however, God's invisible hand is often difficult to discern in an unjust world. Moreover, this brings us back to metaphysical speculation: it is a metaphysical concept of God which, in my view, accounts for most of the problems associated with the doctrine of God. Putting the accent on the human life world and the need for God's love, care and help, rather than on metaphysical "why?" questions, would make faith a great deal more plausible.
Design of a non-causal theology
Metaphysics groans under the tyranny of ontology - that immutable, final ground, the Unmoved Mover, encapsulated in the term "substantial ontology". A relational ontology stresses relations, but it is no longer ontological for, in contrast to ontology, relations are open and variable. Ontology in the absolute sense has always been important, because humans have anchored themselves and their lot in its primeval ground and ultimate promise. The yearning for ontological certainty often becomes an alibi for anchoring our own mortality in the infinite. It is a major shift to uproot ourselves from our metaphysically construed certainties and settle for an incomplete, open, relational existence. Establishing a doctrine of God in this realm means transposing God from eternity to the transience and unpredictability of this world, whose fleeting nature can accommodate the possibility of transcendent experience that fulfils our humanness. Ontology should make way for a quest for experience of life and that which elevates life - with all its chance, probability and chaos, its openness and unpredictability - to sublime meaning.
In such a case, how must we handle the view that chance is such a major factor? Despite their tacit awareness that many events are coincidental, most people prefer to replace chance with perceived patterns, which they then transpose to some agency that is considered responsible for everything. It is not that we are ignorant of how natural laws operate or of the confluence of circumstances that leads to specific experiences; we simply choose to link events with determinism and intelligent causality.
In the final analysis, human beings gave birth to the notion of a First Cause and an Unmoved Mover! A plan for our life, the choice to believe that everything turns out for the good ... It seems as if we lack a capacity to live with chance and randomness. The sciences themselves are not free from this. Control is a hallmark of modernism, which still provides the framework for physical science. We study reality in order to control it, thus making our lives safer and more predictable.
The secret of evolutionary, economic and social success, however, probably lies in that which cannot be controlled and fully grasped.
In his The company of strangers: a natural history of economic life, Paul Seabright (2004:9ff) details the complexity of a simple exercise like buying a shirt. Here are some excerpts from his argument:
This morning I went out and bought a shirt. There is nothing very unusual in that: across the world, perhaps twenty million people did the same. What is more remarkable is that I, like most of these twenty million, had not informed anybody in advance of what I was intending to do. Yet the shirt I bought, although a simple item by the standards of modern technology, represents a triumph of international cooperation. The cotton comes from India, grown from seeds developed in the United States; the artificial fibre in the thread comes from Portugal and the material in the dyes from at least six other countries; the collar linings come from Brazil, and the machinery for the weaving, cutting and sewing from Germany; the shirt itself was made up in Malaysia. The project of making a shirt and delivering it to me in Toulouse has been a long time in the planning, since well before the morning two winters ago when an Indian farmer first led a pair of ploughing bullocks across his land on the red plains outside Coimbatore. Engineers in Cologne and chemists in Birmingham were involved in the preparation many years ago. Most remarkably of all, given the obstacles it has had to surmount to be made at all and the large number of people who have been involved along the way, it is a very stylish and attractive shirt (for what little my judgment in these matters may be worth). I am extremely pleased at how the project has turned out.
To make their task even more challenging, they, or people very much like them, have been working at the same time to make shirts for all of the other twenty million people of widely different sizes, tastes and incomes, scattered over six continents, who decided independently of each other to buy shirts at the same time as I did. And those were just today's clients. Tomorrow there will be another twenty million - perhaps more. If there were any single person in overall charge of the task of supplying shirts to the world's population, the complexity of the challenge facing them would call to mind the predicament of a general fighting a war.
He proceeds to show how it would be all but impossible to appoint a team to manufacture such a shirt without all the individuals who did the spadework outlined in the foregoing paragraph. Yet, even though nobody is exercising specific, overarching control, shirts and millions of other items are readily available all over the world. "In fact there is nobody in charge. The entire vast enterprise of supplying shirts in thousands and thousands of styles to millions and millions of people takes place without any overall coordination at all" (Seabright 2004:10).
John Byl maintained that the principle of sufficient reason requires that everything that happens must have a cause. If there is no physical cause, some human or divine agency must be responsible for it. Hence a sovereign God is responsible for everything, more especially for events at quantum level (quoted in Bartholomew 2008:199). We have seen that randomness at a lower level gives rise to order at a higher level. At the higher level we can speak of purpose, but the two levels are inseparable. The same principle is involved in Aquinas's distinction between God as the primary cause and other, secondary causes. Secondary causes operate at a physical level and are the way the natural order functions. Here God's action cannot be proved, because it is secondary. God, the primary cause, determines such secondary causes (Watson 1993:84).
Locating God's actions at the microscopic levels of chance, randomness and indeterminacy has the advantage of leaving God in charge of everything without transgressing natural laws. Does it make sense to base a doctrine of God on such a notion? "The bizarre picture of God seated in front of a celestial control panel watching microscopic happenings throughout the universe and reacting to them almost instantaneously may be logically possible but it hardly fits with the notion of the loving heavenly Father of orthodox Christian belief" (Bartholomew 2008:152).
God moves and is moved by people
Causality implies control, determinism, even fatalism. It implies absolute power, abolishes freedom, and precludes personal relationships. Of course, that does not mean that God does not exercise influence, even a decisive influence. All interpersonal relations are marked by influencing. A relationship in which the will of one partner is imposed on the other is un-free. When it comes to God's involvement in physical and cosmological processes, we should rather keep quiet instead of requisitioning so-called "gaps" (eg quantum indeterminism) to accommodate God's acts. Besides, it would be impossible to trace the ascent from quantum level to the level of classical physics, human biology, consciousness and, ultimately, human decision-making. Does that entail acceptance of Gould's NOMA (non-overlapping magisteria) principle? Not exactly. Theologically we cannot pronounce on God's involvement in physical processes because, as noted already, our understanding of these leaves no scope for divine intervention. That might pose problems for the existence of a metaphysical God, but not that of a personal God. Theologians have already dismissed classical cosmological proofs of God's existence. The level at which people experience God is that of a personal relationship. We have to abandon a theology of power (causality), as the following excerpt from De Saint-Exupéry demonstrates.
In his The little prince (1991: section x) Antoine de Saint-Exupéry recounts how in the course of his travels the little prince comes to a planet occupied only by a very august king. When the little prince yawned at some point in their conversation the king commanded him not to yawn, since it was improper to do so in his presence. When the little prince explained that he couldn't help it, the king was obliged to command him to yawn in order to maintain his dignity. The king then admitted that he could not command one of his generals to change into a seagull, for the man's inability to comply would be no fault of his own (implying that the ruler has to adapt to the natural order!). When the prince begged leave to ask a question, the king commanded him to do so. And so the story goes on. Eventually, when the little prince wanted to leave, he suggested to his majesty that, if he wanted implicit obedience, he should command him to leave within a minute or so.
In a way our interaction with God is similar. The little prince is a metaphor for science (and theology!) engaging in dialogue with God and, via the intrinsic necessity of the natural order, indirectly demanding that, in order to remain in control, God must command natural laws to be what they are. For God to retain his sovereignty and divine dignity he must constantly adapt his nature to science and people's logic, insight and worldviews. The God concept cannot be logically inconsistent. So, we postulate that God subordinates himself to the laws he himself imposed; that he designed his creation as an autopoietic system; that he is beyond space and time yet at the same time contains them; that he is somehow present in all physical and natural processes, for he has to control everything, hence nothing happens outside his absolute will and presence.
A non-causal theology puts paid to God as a First Cause and an Unmoved Mover. Aristotle's unmoved mover culminated in "the One" of Plotinus's Enneads. It was taken over by Augustine, as Aristotle was taken over by Aquinas. The God of predestination accords with the closed Calvinist image of immovable omnipotence, the deistic God concept with a mechanistic Newtonian worldview. A mystical God is in harmony with Romanticism and subjectivism. The suffering God (Moltmann) stems from the helpless suffering of World War II. By the same token a God who emerges from the complexities that characterise life might accord with our contemporary understanding of evolution.
Theologically, strictly speaking, we cannot talk about God; but neither can we keep silent. We can, however, keep silent about issues like the exact way in which God is supposed to determine and control everything. The aim is not to make pronouncements about precisely what God is like or is not like.12 Indeed, if God is the Totally Other an apophatic theology is all that remains. The motive for saying at least something about God is to reformulate God concepts generated by a worldview that we have outgrown in light of our present-day understanding of ourselves and the world. That does not mean we have to kowtow to modern science; but neither should we cling to outdated images of bygone days.
The proposed non-causal theology would endeavour to shed the unnecessary metaphysical baggage that has characterised philosophy and theology since Greek antiquity. It would dispense with God's cosmic sovereignty, which conflicts with the scientific interpretation of reality. What was said in the past was either symbolic or a reflection of the worldview of those times. The causal God is a philosophic, metaphysical God. "The real problem, as we all know, is that God explains too much. God can do anything. And when God does everything, there is nothing to explain. Or, rather, the explanation of everything is this: God created the heavens and the earth. Nothing more is required" (Watson, 1993:83). God need not be the explanatory principle for creation, cosmology, and physical processes. Whether we "see" God's hand in them or not makes no difference to physical processes as we know them today.
A non-causal theology will distinguish between God and causally determined natural processes. It will abstain from literal interpretation of creation stories, which has become a major headache for proponents of the traditional Christian God concept. God is not forcibly infused into all matter willy-nilly, be it by way of pantheism, panentheism or absolutely sovereign theism.
One of the more plausible explanations of God's involvement with the world is process theology. I confine myself to the following quotations from Whitehead, which tally with the proposed non-causal theology.
Whitehead identifies three focal points that emerged from the early development of theism: God as ruler, as personification of moral energy and as a philosophic principle. He sees this as contrasting with what he calls the "Galilean origin of Christianity". "It does not emphasize the ruling Caesar, or the ruthless moralist, or the unmoved mover. It dwells upon the tender elements in the world, which slowly and in quietness operate by love; and it finds purpose in the present immediacy of a kingdom not of this world. Love neither rules, nor is it unmoved; also, it is oblivious as to morals. It does not look to the future; for it finds its own rewards in the immediate present" (excerpt from Process and reality, cited in Frankenberry 2008:183). Whitehead opposes the notion of God's omnipotence: "the presentation of God under the aspect of power awakens every modern instinct of critical reaction ... The Church gave unto God the attributes which belonged exclusively to Caesar" (Frankenberry 2008:180).
Whitehead conceives of God's nature as dipolar: a primordial and a consequential nature. His consequential nature is the history he shares with us, which in turn influences his primordial nature. "He does not create the world, he saves it: or, more accurately, he is the poet of the world, with tender patience leading it by his vision of truth, beauty, and goodness" (Frankenberry 2008:187).
To my mind this does not imply that God should be regarded as the cause of all natural processes that are actualised moment by moment (see also Frankenberry 2008:185).13 Such an approach remains at the causal-deterministic level. It is a vestige of our animist past which imbued all matter with vital force. God is not to be found in matter, in atoms, in energy transfer, or in entropic processes. That is part of the causal-control syndrome, the foundation of metaphysics.14 However, regardless of culture or era, God has always surfaced as the deepest question and deepest answer whenever humans are challenged by their circumstances and experiences. God is manifested in our response to the major challenges of life. That does not necessarily imply that events are reversible, sickness can be miraculously healed, the dead can come to life, the clock can be turned back, accomplished deeds can be undone. What it means is that God influences my way of coping with all eventualities.
A non-causal theology is not about a God who exercises no influence. Influencing induces change, and that needs to be emphasised. So, the point is the way God is manifested in events that affect people profoundly and how he influences them to cope with these. A non-causal theology has no ready-made blueprints or determinism; its hallmarks are freedom and an open encounter between God and human beings.15 One could call it a consequential theology, a theology of the moment. After all, religion is the experience of God one has in moments of distress, fear, tribulation, worship, prayer, encounter. It is a confluence of many circumstances that prepares the moment when the transcendent God manifests himself (is experienced) in immanent reality. The accent is on awareness of God's presence in and via events. Humans know nothing about the hypothetical God who is supposedly active in quantum, cosmic or biological processes, nor are they influenced by him.
Dingemans (2001:105) cites Houtepen, who says that we must bid "God the First Cause" goodbye and look forward to the God of the future: God is an open question. As far as I know, however, the implications (especially the cosmological and physical implications) of doing away with the First Cause or Unmoved Mover have not yet been spelled out.
A causal God controls everything, whereas a non-causal approach proceeds on the premise that human beings are so profoundly moved by events experienced as encounters with the transcendent that they themselves make things happen, change their perceptions, change their actions. The freedom implicit in a non-causal approach entails assuming responsibility and facing up to unhappiness (i.e. actions/thoughts that make us unhappy). In such a context we feel we are fighting on God's side against the evils, injustices and suffering in the world. After all, these things are not caused by God.
We have learnt that the world is not predetermined or controlled, because determinist causality at the level of multiplicity is unlikely. Maybe the Archimedean shift required in this instance is to see God as an outcome, not a cause. Once something has happened it is a fait accompli. We cannot alter it, but we can analyse, reconstruct, look for causes, level charges; and then get on with it and make the best of the situation. Being caught up in a labyrinth of "why me?" questions consumes a lot of energy and condemns us to lifelong repetition of bad experiences imprinted in memory. That does not mean we must acquiesce in everything that comes our way, or that we should not take measures to prevent a recurrence of unpleasant events.
Thus a non-causal theology operates with the concept of emergence. "The term 'emergence' connotes the image of something coming out of hiding, coming into view for the first time - something without precedent and perhaps a bit surprising" (Deacon 2008:121). The causality at issue is again determined by lower-order processes, but not by any form of predetermined law. "These phenomena are often called self-organizing, because their regularities are not externally imposed but generated by iterative interaction processes, occurring in the media that comprise them" (Deacon 2008:123).
God is not simply the permissive cause (will) behind incomprehensible suffering, injustice, poverty, sickness and death: he bears the consequences alongside humans and is present during the process of recovery and resuming our lives. That surely accords with our interpretations of the words and deeds of the earthly Jesus. Most Gospel narratives are not about Jesus' account of the causes of events (see John 9, where Jesus' observation about the man who was born blind to some extent severs the chain of causality), but about his concern with the consequences: sickness, death, hunger, injustice, transgressions.
What makes better sense is the existential relational level where - in human circumstances, however dire or fortunate they may be - we have a relationship with a living God. God invariably manifests himself in moments of crisis. That is when his existence and interaction become real, including also in beauty and in moments when justice is done.
The accent is on a God who reacts rather than one who merely acts deterministically. Here he is one of many causes at work that influence circumstances; not the sole, all-determining (deterministic) cause. The Old Testament image of God travelling with his people presents a facet of God who looks with his people to see what the future holds and helps them contend with it. After all, it is pointless to travel with someone if you have mapped the journey in advance and already know what will happen. In fact, the only authentic way to understand the interaction between people is when the relationship is "open" and no outcome is predetermined. God happens. Yet events can never be imposed: we often have to wait for God, and the outcome of God's interaction with humans is often startlingly new. God changes in the process, just as people change. It is analogous with any relationship between equals in which shared experience influences the role players in all directions.
Deacon, T.W. 2008. Emergence: the hole at the wheel's hub, in Clayton, P. & Davies, P. (eds.), The re-emergence of emergence. The emergentist hypothesis from science to religion. Oxford: Oxford University Press, 111-150.
Dingemans, G.D.J. 2001. De stem van de roepende. Kampen: Kok.
Du Toit, C.W. 2006. Tout est bien? Natural and supernatural causes of evil. Perspectives from Hume's treatise and Voltaire's Candide. Scriptura 93(3), 315-329.
Du Toit, C.W. 2007(a). Limitations of the concept 'law of nature' as a source in science, philosophy, theology and law, in Drees, Willem B., Meisinger, H. & Smedes, T.A. (eds.), Humanity, the world and God: understandings and actions (Studies in Science and Theology, Volume 11, 2007-2008). Lund: ESSSAT, 175-190.
Du Toit, C.W. 2007(b). Seasons in theology. Inroads of postmodernism, reference and representation. Pretoria: Research Institute for Theology and Religion, Unisa.
Eigen, M. & Winkler, R. 1983. Laws of the game. How the principles of nature govern chance. Harmondsworth: Penguin.
Frankenberry, N.K. 2008. The faith of scientists in their own words. Princeton & Oxford: Princeton University Press.
Gregersen, N.H. 2008. Emergence: what is at stake for religious reflection? in Clayton, P. & Davies, P. (eds.), The re-emergence of emergence. The emergentist hypothesis from science to religion. Oxford: Oxford University Press, 279-302.
Hacking, I. 1990. The taming of chance. Cambridge: Cambridge University Press.
Kulstad, M.A. 1993. Causation and preestablished harmony, in Nadler, S. (ed.), Causation in early modern philosophy. Pennsylvania: Pennsylvania State University Press, 93-118.
Lewis, M. 1997. Altering fate: why the past does not predict the future. New York: Guilford Press.
Nadler, S. 1993. The occasionalism of Louis de la Forge, in Nadler, S. (ed.), Causation in early modern philosophy. Pennsylvania: Pennsylvania State University Press, 57-74.
Pearson, K. 2010. The scientific law, in Turner, S. (ed.), Causality I. London: Sage, 201-226.
Rutherford, D.P. 1993. Natures, laws and miracles, in Nadler, S. (ed.), Causation in early modern philosophy. Pennsylvania: Pennsylvania State University Press, 135-158.
Seabright, P. 2004. The company of strangers: a natural history of economic life. Princeton: Princeton University Press.
Van der Kooi, C. 2005. As in a mirror. John Calvin and Karl Barth on knowing God: a diptych. Leiden: Brill.
Watson, R.A. 1993. Malebranche, models, and causation, in Nadler, S. (ed.), Causation in early modern philosophy. Pennsylvania: Pennsylvania State University Press, 75-92.
Weber, O. 1972. Grundlagen der Dogmatik, Vol. 1. Göttingen: Neukirchener.
1 Darwin (1809-1882) never dealt with or criticised the concept of causality, but he was open to the new science of statistics (Darwin knew nothing about mathematics). It is intriguing that many of the fundamental evolutionary concepts - chance, variability, measurement, etc. - were basic to the development of statistics. Darwin did draw on the work of his cousin Francis Galton, a pioneer in this field. In 1865, Galton demonstrated that statistical analysis of heredity supports the idea that moral and mental qualities can be inherited.
2 Pearson (2010:205-16) points out that the universality and absoluteness attributed to natural laws are relative to human thought. For a discussion of the concept of natural law, see Du Toit (2007).
3 Also see Moira for tragic fate and Pronoia for deterministic forces.
4 For lack of space I cannot deal with Leibniz's monadology, more specifically his notion of pre-established harmony and the role of causality in that regard. Monads operate autonomously and harmonise, not because of causal factors or continual divine intervention, but as a result of pre-established harmony. Unlike the Occasionalists, Leibniz assigns substances sufficient inherent power to determine their own states and changes (Kulstad, 1993:109ff; Rutherford, 1993:137).
5 A probability is a number purporting to measure uncertainty, commonly expressed as a number in the range 0-1, with the upper end of 1 denoting certainty and the lower end of 0 corresponding to impossibility (Bartholomew, 2008:69).
6 It was formulated as follows: "When the excess of the majority over the minority is d, and the reliability of an individual juror lies between 1/2 and 1, the probability that a jury is correct is 1/(3d + 1)" (Hacking, 1990:94).
7 The concept of probability (read in conjunction with chance) could have emerged as early as about 1660, in the time of Leibniz (1646-1716) (see Hacking, The emergence of probability). Galton postulated the law of probability in 1889.
8 For a severance of the causal connection with the past (upbringing and background) in people's development and self-image, see Lewis (1997).
9 Here a distinction is drawn between epistemological chance, being instances where we lack knowledge of the causal factors, and ontological chance, being chance that characterises the nature and evolution of aspects of reality (Bartholomew, 2008:4).
10 The bell-curve varies with changes over time, yet the curve remains fairly constant. However, reality continually changes: "The more of reality we comprehend, the more aware we become of horizons that keep receding before us and are as unattainable as the peripheries of the expanding universe" (Eigen and Winkler, 1983:187).
11 According to Hacking (1990:160) the word 'normal' acquired this meaning in the 1820s.
12 Watson (1993:81) observes: "The two main problems with God in philosophy are first, that resort to God explains too much, and second, that when we talk of God we do not know what we are talking about."
13 Gregersen refers to this as temporal theism (ibid.).
14 It was also a feature of positivism: "Cause, in the canons of positivism, was a metaphysical notion. A good way to surpass metaphysics was to annul causation" (Hacking, 1990:187).
15 The classical theological concept of concursus would be appropriate. It forms part of the doctrine of predestination, which distinguishes between conservatio, concursus and gubernatio (governing). Hence the notion of concursus is indissolubly linked with God's prescience (Weber, 1972:568ff).