Dissertation Proposal

Emergence and Divine Action:
Exploring the Dispositional View of Causation
as a New Philosophical Foundation

My dissertation proposal was accepted at the GTU systematic and philosophical theology area meeting on Oct 15. All I have to do now is have it approved by the GTU doctoral council (sometime in November) and then write it. 🙂 A few words of explanation concerning my current research, taken from an email I sent to a scholar I am corresponding with online, will serve as a good introduction to the topic of my work.

The reason I got interested in dispositional metaphysics is its rejection of the Humean view of causation and its re-connection with Aristotelian metaphysics and philosophy of causation. But I'm afraid there is no easy connection to be established between the two. Although some thinkers like Brian Ellis argue in favor of essentialism (see his Scientific Essentialism), they are neither ready nor willing to accept hylomorphism. The other problem is teleology. Molnar speaks about the natural "physical intentionality" of powers to manifest themselves, but hardcore Aristotelians are not satisfied. For them Aristotle's distinction between active and passive potencies is crucial. They emphasize the character of the active potencies, which are causal grounds of certain effects without being determined to those effects by nature and without requiring any stimulus condition to obtain. (See for instance the paper by Errin Clark, which will soon be published in the proceedings of the ACPA conference that took place a week ago in D.C. and was dedicated to dispositional metaphysics.) But this whole argumentation sounds like another criticism of the conditional view of causation, which is itself criticized by several dispositionalists – so they can defend themselves here. But the question remains: how Aristotelian is dispositional metaphysics?

The complex systems approach, emergence, and systems theory are fascinating in terms of their re-discovery of complex structures and their holistic approach to reality. But they are stuck with the Humean view of causation, which is based on his atomistic ontology of events and his dismissal of the ontology of objects. Yet one ontology cannot do without the other. Objects have properties (smell, age, physical construction) which cannot be ascribed to events. But acknowledging this requires of us a step beyond efficient causation, which is the only kind accepted in modern science. Scientists are very suspicious about making this move and buying into formal and final causes. They want to have their cake and eat it too. That is, they argue in favor of irreducible complexity in systems theory, while saying – at the same time – that after all everything is explainable at the level of physical particles. They call themselves "non-reductionist physicalists," which I think involves a logical contradiction. If they are willing to buy into formal and final causes, they claim – as Deacon does – that these emerge in the course of the growing complexity of the organization of matter, whereas for Aristotle these causes are simply out there all the time and ground all structures and processes not only bottom-up or top-down, but – as my advisor Michael Dodds OP says – inside-out.

My project will be to propose dispositional metaphysics as a philosophical base and ontology for Deacon's emergentism and to suggest that accepting a sort of essentialism (not necessarily hylomorphic essentialism) does not contradict science but opens it to a philosophy of nature, which can help overcome the causal closure imposed by modern philosophy and science. In the second part of my dissertation I will work on a theory of divine action based on emergentism. I will show that dispositional metaphysics opens the way back to the Aristotelian-Thomistic view of causation, divine action, and the God/world relation, which I want to propose as an alternative to the panentheistic theology of divine action based on emergence developed by Arthur Peacocke, Philip Clayton, and Niels Gregersen.

From Spinoza to Hegel

I’m publishing another paper that I wrote during the first two years of my studies at the GTU. It will eventually become an important part of one of the chapters of my dissertation.

 
Abstract: In recent years, the concept of panentheism has become one of the most influential methodological frameworks among authors contributing to the science/theology debate in the Anglo-American context. However, a deep and well-weighed study of its philosophical foundations is still lacking. Moreover, a more critical evaluation of its legitimacy within theological reflection in the context of natural science is needed. The aim of this article is twofold. First, I present an analysis of a critical shift in metaphysics and the philosophy of God: I trace the origin of modern panentheism, the trajectory from Spinoza to Hegel, from substance to subject, from an ontologically independent to an evolving God. Second, I refer to Barbour, Peacocke, and Clayton and try to reveal crucial problems that challenge their versions of panentheism, as well as the one presented by Hegel. I claim that they all fail to express properly God's transcendence. I argue from the position of classical theism.
View the paper in the PDF file (published on my profile on academia.edu)

Philosophy of Causation – Vol. 3 – Modernity

The struggle between the more theoretical approach to scientific explanation in medieval Oxford and the empirical and causal approach of Aristotle promoted in Paris prepared the ground for the big change that came with modernity. The advance of the new science brought a gradual dominance of reductionist empiricism and a rejection of causal explanation in metaphysics. The third part of the first segment of my special comprehensive exam depicts the way in which this change happened, with reference to five philosophers from that period.

René Descartes (1596-1650)

In order to understand the methodology, ontology, and reflection on causality presented by Descartes, the father of the philosophy of classical science, we must remember that he was deeply rooted in Aristotelianism and scholasticism (more deeply than is usually acknowledged). He was particularly impressed with Aristotle's demonstrative ideal of knowledge, best exemplified in mathematics. His ambition was to apply this methodology to the physical sciences: "In physics I should consider that I knew nothing if I were able to explain only how things might be, without demonstrating that they could not be otherwise. For having reduced physics to mathematics, this is something possible …" (Oeuvres de Descartes). In order to make it happen he introduces a dualist ontology in which material substance is subject to mathematical, mechanical, and deterministic description, and is distinguished from mind, which is unextended but capable of thought.

However, because he was rooted in Aristotle, Descartes saw scientific knowledge as being not only demonstrative but also explanatory, and therefore referred to causal description (even if he understood explanation only mechanistically). Thus, he would put an emphasis on experimentation and value hypotheses, saying in his Discourse that he finds experiments "more necessary in proportion as our knowledge advances." He was indeed both a keen observer of nature and a careful experimenter. He saw hypotheses as: a) being equivalent to mechanical explanation, or b) offering possible explanations of phenomena, or c) suggesting the "true cause" of phenomena based on experimental evidence. But this aspect of Descartes' method would never hinder nor question his rationalism. In the end he would regard experiments as merely confirmatory and not capable of refuting conclusions that he arrived at by rational insight. He says that: "while our experiences of things are often fallacious, deduction (…) can never be wrongly performed by an understanding that is in the least degree rational" (Rules for the Guidance of our Native Powers). To sum up, we find in Descartes a philosopher trying to relate his a priori deductions to the actual world which he experienced and in which he performed his experiments. He saw the latter, practical part of his methodology as providing data for analysis and an opportunity for the mind to gain an intuition of the clear and distinct ideas underlying the particulars it has experienced.

Concerning the philosophy of causation, while the Aristotelian ideal of strict demonstration through causes permeates his work, Descartes rejects the Philosopher's fourfold notion of causality. Replacing the complex scholastic system of substantial forms, qualities of various kinds, elements, and other particles with the clear and distinct idea that bodies are composed of particles in motion, he banishes formal and final causality (the latter he saw as being beyond human understanding). Though he still subscribes to material causality, he reduces it to basic physical constituents (rejecting "primary matter"), and eventually finds himself left with just one – the efficient cause. Because it bears the burden of all other types of causation, Descartes develops a more complex theory of efficient causality. He distinguishes between the efficient cause of the being and existence of all created things, which he attributes to God (causa efficiens et totalis), and the efficient causes of the various phenomena of the physical world (which have more explanatory power from the scientific point of view). His view of causation becomes deterministic. He sees the determinism of natural laws built into the efficacy of mechanical motion. Ends are predetermined by God and are opaque to human understanding, but through the laws that God predetermined "as the efficient cause of all things," we can explain "the effects that we perceive by senses" (Principles of Philosophy).

Gottfried Wilhelm Leibniz (1646-1716)

Descartes won great popularity among the philosophers of his time. His method, including his causal reductionism, was followed by Hobbes, Gassendi, and Spinoza. However, there is one philosopher who would depart from this Cartesian tradition. Although in his early reflection Leibniz had followed the crowd of philosophers who valued mainly mechanical principles, he eventually changed his mind and became a defender of a plural notion of causality (which he defined in a manner that was peculiarly his own).

Leibniz's early fascination with mechanical philosophy finds its best expression in a paper written in 1677 (On a Method of Arriving at a True Analysis of Bodies and the Causes of Natural Things), in which he states that: "since we may perceive nothing accurately except magnitude, figure, motion, and perception itself, it follows that everything is to be explained through these four." Here we see a Leibniz convinced that through experimentation and observation one can arrive at a mechanical explanation which supplies the ultimate "true causes" of phenomena.

About eighteen years later, in Specimen Dynamicum, Leibniz is less optimistic about the mechanical character of scientific explanation. Reflecting on the causes of motion, he proposes a new solution through the idea of force. What is crucial here is that he treats force as something prior to extension, present everywhere and implanted by the Author of nature. He distinguishes between active and passive aspects of force, and thus goes back to formal and material causality as defined by the scholastics. Active force is of two kinds: a) a primitive active force, which is nothing other than the "first entelechy" and corresponds to the soul or substantial form; and b) a derivative active force, which is exercised through the limitation of primitive force resulting from conflicts between bodies. Passive force is likewise divided into two kinds: a) a primitive passive force, which constitutes the very thing and resembles the scholastic idea of materia prima; and b) a derivative passive force, which is found mainly in the inertia and resistance of secondary matter.

Leibniz further develops his defense of a plural notion of causality in his justification of metaphysics. He says that matter cannot stand by itself and that mechanism needs intelligence and spiritual substance, for it did not come from a material principle and mathematical reasons alone. Here he introduces a reflection on final causality. There is no doubt that for him God is the ultimate cause of everything in the universe. But he does not follow the occasionalism of Malebranche; he acknowledges that God put into things certain properties which explain all their predicates. God's eternal law is carried out by the activity of creatures.

These developments of Leibniz's metaphysics find another expression in his Monadology. The internal forces of monads can be identified with substantial form. When conceived as appetites, they also have a teleological character. The efficient causality of monads is not reciprocal: they cannot influence one another. That is why Leibniz introduces the rather vague idea of the principle of sufficient reason, which is supposed to explain the existence of and relations between things (monads) apart from efficient causation (the meaning of both "sufficient" and "reason" is metaphysically unclear). The final cause of the order of monads and their characteristics is God. Although, according to his reflection in Monadology, efficient and final causality are complementary, Leibniz does not entirely escape the problem of determinism, which in his philosophy takes the form of a pre-established harmony.

David Hume (1711-1776)

Despite Leibniz's defense of formal and final causality, Descartes' causal reductionism became more and more pervasive. It was followed by Locke, Newton, Berkeley, and Hume, who is regarded by many as the one who dealt the final blow to the theory of causation. While Berkeley argued against causal explanations (with reference to any kind of causation) in physical science, Hume generalized this opinion and applied it to natural philosophy and metaphysics, and thus did away with causality altogether. He claimed that, given the concept of causal necessity, there is no way of justifying it rationally (causal necessity is not a logical necessity). He questioned three basic assumptions which he believed we accept in our notion of causation: 1) contiguity in space and time between cause and effect, 2) temporal priority of a cause to its effect, and 3) a necessary connection between cause and effect.

In his Treatise of Human Nature Hume assumes that every idea in our mind is based on a prior impression. That is why, in order to justify our idea of causation, we must find the impression that gave rise to it. The idea of necessity has its source in many instances of similar occurrences. It is an outcome of constant conjunction, which produces an association of ideas in our mind. Hume emphasizes that the tie of necessary connection "lies in ourselves, and is nothing but the determination of the mind, which is acquired by custom," and which we tend to project onto the world. Here he gives two important definitions of cause: 1) a cause is "an object precedent and contiguous to another, and where all the objects resembling the former are placed in like relations of precedency and contiguity to those objects, that resemble the latter;" 2) a cause is "an object precedent and contiguous to another, and so united with it, that the idea of the one determines the mind to form the idea of the other, and the impression of the one to form a more lively idea of the other" (A Treatise of Human Nature). Because causal relations are not logically necessary, they cannot be known a priori. In order to determine whether there is a causal relation between A and B we must rely on our experience of similar relations. Thus, constant conjunction or regularity is both a necessary and a sufficient requirement for causation, especially if we assume that "we never can, by our utmost scrutiny, discover anything but one event following another" (An Enquiry Concerning Human Understanding).

But Hume's metaphysical and methodological position, as well as his opinion about causation, are not as clear and transparent as is usually assumed. First of all, his embrace of radical empiricism is accompanied by some realist implications. While he states that the mind is concerned only with its own ideas and cannot perceive anything of the reality outside itself (radical empiricism), he adds at one point that all our distinct perceptions are distinct entities (strong realism). Considering Hume's theory of causation, Don Garrett ("Hume" in The Oxford Handbook of Causation) points toward three possible interpretations of his position: 1) causal projectivism – a projection onto cause-and-effect pairs of an element of felt necessity that does not resemble anything in these pairs and is derived from the experience of constant conjunction (causation is mind-dependent); 2) causal reductionism – causality is reducible to constant conjunction, but constant conjunction alone is not sufficient for knowledge of causal relations, and we have to know, a posteriori, that constant conjunction is what the causal sense detects; 3) causal realism – it is possible to argue that Hume can accept causal cognitivism while rejecting the semantic reductionism that treats causal claims as synonymous with claims about constant conjunction. Garrett claims that Hume "concedes something to the motivations of each of these packages, and he could with some justice be classified as subscribing to any of them, or all of them, or none of them – depending on the details of the more specific definitions that might be proposed for them."

Immanuel Kant (1724-1804)

Kant found himself confronted with the rationalism of Leibniz and the empiricism of Hume, which he regarded as being in accord with Newtonian physics and philosophy but as incorporating too much skepticism. Therefore, he was looking for a middle way which would incorporate both the a priori elements of human knowledge stressed by rationalists and the necessity of synthetic judgments based on experience emphasized in empiricist circles. He sought a synthesis of these two approaches in a new type of judgment which is both synthetic and a priori. The price of this operation was significant. Since we can know a priori of things only what we ourselves put into them, synthetic a priori knowing "has to do only with appearances, and must leave the thing in itself as indeed real per se, but not known by us" (Critique of Pure Reason).

Taking this position in his critical period, Kant consequently and radically changes his way of philosophizing. While before the Critique he was concerned with the problem of how one substance acts on another, the problem of cause and effect, and the problem of space and extension as constituted only by the interaction of substances, in his critical reflection substance as a basic explanatory principle is replaced by the a priori forms of sensibility and the categories of the understanding (he lists twelve of the latter). They are understood as the conditions of the possibility of experience, and are valid a priori for all objects of experience.

Following this methodological strategy, Kant wanted to ground the principle of causality in the structure of reason. But in order to avoid the epistemologically disastrous consequences of the Humean criticism, he classified it as a principle explaining the synthetic a priori judgments of science. An event A is the cause of an event B iff there is a universal law which says that events of type A are necessarily followed by events of type B. But because neither the necessity nor the universality of the causal relation can be established empirically, it has to be grounded in the a priori conditions of judgment of a possible experience. Thus we can see that Kant rejects Hume's view that we first perceive temporal succession between events and only afterwards are able to name one of them as cause and the other as effect. For him the opposite is true: in order to establish an objective order of events in time, we need the a priori synthetic concept of cause-effect relationships.

Summing up, we can say that Kant thought he had proved that concepts such as "cause" ("substance," etc.) "stand a priori before all experience and have their undoubted objective rightness, though admittedly only in respect of experience" (Prolegomena to Any Future Metaphysics). For him every occurrence has a cause, which is a prior event. The effect follows from the cause necessarily, and in accordance with a rule which is absolutely universal. All this is known to us a priori, but not without reference to experience.

Auguste Comte (1798-1859)

Despite Kant's struggle to save causation in epistemology and scientific explanation, the positive method in science and philosophy followed the way proposed by Berkeley and Hume, disposing of the search for causes. Positivists, in place of causes, searched only for correlations between facts, which can be expressed in general laws of phenomena. Auguste Comte, one of the main representatives of the positivist camp, formulates his famous law of three stages: 1) theological or fictitious, in which the mind seeks the essential nature of beings, 2) metaphysical or abstract, in which the mind replaces supernatural agents with abstract forces, and 3) scientific or positive, in which the mind gives over the vain search for absolute notions and studies the laws of natural phenomena (The Course in Positivist Philosophy). Although he endorsed the use of hypotheses, Comte accepted them only provided they were open to empirical verification. He stressed prediction rather than explanation as science's primary aim. At one point he speaks very bluntly, leaving no doubt as to what he thinks about causal explanation: "The basic characteristic of the positive philosophy is that it regards all phenomena as subjected to invariable natural laws. Our business is – seeing how vain and senseless is any search into what are called causes, whether first or final – to pursue an accurate discovery of these laws, with a view to reducing them to the smallest possible number" (The Course).

In spite of his radical rejection of causality, Comte did not reduce his "positive philosophy" merely to induction. He still valued deduction for establishing particular conclusions related to special topics of research, and as an essential tool for making predictions. For this reason, we can venture to say that Comte in fact followed the path of those post-Cartesian philosophers who hoped to steer a middle course between empiricism and rationalism. Nevertheless, it seems that nothing was really able to change the course that philosophy of science took after Descartes, which led to the decline and final dismissal of causal explanation in phenomenalism, conventionalism, operationalism, and the orthodox interpretation of quantum mechanics. However, the story does not end with these trends in philosophy. Causality is coming back on stage with anti-Humean tendencies and the return of realism, accompanied by some contemporary reflections on causality within the analytic tradition. But this is material for yet another story.


Hegel and Whitehead – part 1: Panentheism

I have just posted a new sub-page on my blog, which will contain abstracts and links to my official publications. I will try to explain the content of each one of them in several steps.

My first article in English was published in May 2013 in Theology and Science, a journal edited by the Center for Theology and the Natural Sciences in Berkeley. Its title is quite sophisticated: Hegel and Whitehead: In Search for Sources of Contemporary Versions of Panentheism in the Science–Theology Dialogue. You can read an abstract here.

First of all, what is panentheism about? It all goes back to one of the main concerns of our reflection about God, which has been a source of struggle for theologians over the centuries. On the one hand, we acknowledge that God is totally different and unlike anything that we know by our sense experience and intellectual reflection. God is totally and absolutely transcendent. He is omniscient (knows everything), omnipotent (can do everything), eternal, and above all – unchangeable. Therefore – according to St. Thomas Aquinas – while our relation to God is real and changes us, this same relation on the side of God is only a relation of thought (reason), because it cannot change God nor add anything to His essence, which is pure act, without any hint of potentiality.

On the other hand, we have to acknowledge that God is radically immanent, that is, He is present in the world and all its aspects. Saint Thomas defines the very act of creation as the total dependence of every creature in its being on God. God is the source of the very existence of everything. Were He not present in contingent beings at every moment of their existence, they would perish at once. They actually exist because they participate in the infinite being of the Creator.

And here comes the question: how does one bring God's transcendence and immanence together into one model of His divine action? One of the possible answers goes back to Pseudo-Dionysius (5th–6th century) and was further developed by St. Thomas Aquinas. It names three ways of speaking about God. The positive way enables us to formulate positive statements about the Creator. We may say, for instance, that God is good, and His goodness is revealed in creation (God's immanence). But at the same time we must acknowledge that He is not good in the way that we are good. In other words, God's goodness is unlike our goodness. This is the way of negation. Following it, we realize that it is more appropriate for us to say what God is not, rather than what God is (God's transcendence). And yet the connection with our human categories is not totally rejected. The third way – the way of eminence – saves it, claiming that God is good, but in an eminent way, which goes beyond any kind of goodness known to us. God is the source of all goodness, as His goodness is identical with His essence. This way of speaking about God is based on the doctrine of analogy, which I hope to explain in a separate entry.

This way of bringing together the transcendence and immanence of God saves both of them and supports the classical Thomistic model of the God–world relationship, which I describe on the left side of the diagram below (click on the image to view it in a higher resolution).

Aquinas's view of the God–world relation remained in radical opposition to pantheism (the middle model on the diagram), which assumes that the world is God and that God's essence is exhausted in the world taken as a whole (an idea coming back today in the New Age and other "ecological" spiritual movements). However, although it was commonly accepted and supported throughout the centuries, the classical model of Aquinas has recently been accused (beginning in the late 19th century) of overemphasizing God's transcendence. A God who does not have a real relationship to His creation – says the main charge – is not the God of love. If creation cannot affect God, then He is not concerned with what is happening in the world. He is a God of the philosophers, but not of the Bible.

As a remedy to this crisis, some theologians proposed a new model of the God–world relationship – panentheism. It suggests that the world is in God (a link to pantheism), and yet God is more than the world (a link to the classical theism of Aquinas). See the right-hand model on the diagram above. Proponents of this version of the God–world relation suggest that because the world is in God, it has to affect God; therefore He is no longer unchangeable, and His eternity is affected by time. Moreover, when creating the world God decides to limit His omniscience and omnipotence in order to make space for our freedom and contingent events in the world. He is not a detached ruler, but a fellow sufferer who understands. And yet – according to panentheism – God is still transcendent, because He is more than the world.

The panentheistic model has become very popular and has found many applications in contemporary theology, especially in the circles of the theology and science debate, where it seems suitable for explaining the theological implications of the contingency and indeterminacy of natural events. However, at the same time, it raises some basic and crucial questions. The first and most important among them refers to God's transcendence. If the world is in God and affects God, then it has to share God's essence (God's nature or substance), which challenges the truth about God's transcendence. Moreover, if creation of the world changes God and limits some of the attributes which are essential to His divinity (e.g. omnipotence, omniscience, eternity), it is hard to agree that He is still the God we believe in. If this is the case, then the claim of panentheism's proponents, who say that it does justice to both God's transcendence and immanence, simply does not hold.

Contemporary panentheism has many faces and versions. The truth is that it also has a long historical tradition, especially in the philosophical reflection on God and the God–world relationship. Its roots go back to ancient Egypt and Greece. In my article I concentrate on two philosophical versions of panentheism: the one proposed by Georg Wilhelm Friedrich Hegel, and the other developed by Alfred North Whitehead. The former philosopher may be regarded as the father of the modern version of panentheism, while the latter has become very popular in the contemporary science-theology dialogue. I ask the question of the possible relation between their versions of panentheism and the nuances in their understanding of God's transcendence and immanence.

That's it for now. It is a prelude to the main body of the article, which I hope to summarize in the next episodes under the same title: Hegel & Whitehead.

Ian G. Barbour 1923-2013 in memoriam


In Memory of Ian G. Barbour

On the 24th of December, at the age of 90, Ian Barbour passed away in Northfield, Minnesota. He was the first to inspire a fruitful debate on science and religion, back in the 1960s, when the entire environment of academia, in both the natural sciences and philosophy, was rather skeptical, if not hostile, toward theology. In the citation nominating Barbour for the 1999 Templeton Prize, John B. Cobb rightly said that "No contemporary has made a more original, deep and lasting contribution toward the needed integration of scientific and religious knowledge and values than Ian Barbour. With respect to the breadth of topics and fields brought into this integration, Barbour has no equal."

Education and Career

Barbour was born in Beijing, China. His mother was an American Episcopalian, and his father a Scottish Presbyterian. He grew up in China, the US, and England. After receiving his PhD in physics from the University of Chicago in 1950, he studied theology, earning his B.Div. from Yale University's Divinity School in 1956, and taught at Carleton College beginning in 1955. He published Issues in Science and Religion in 1966 and Myths, Models and Paradigms in 1974. He became well known after his Gifford Lectures of 1989–1991 at the University of Aberdeen. These lectures led to the book Religion in an Age of Science (re-edited and republished in a more extended version as Religion and Science: Historical and Contemporary Issues in 1997), which is probably the best summary of his position. He was awarded the Templeton Prize in 1999 in recognition of his efforts to create a dialogue between the worlds of science and religion. Barbour was married to Deane Kern from 1947 until her death in 2011. They had four children.

Barbour’s Contribution to Science and Theology Dialogue

It is hard to overestimate the value of Barbour's contribution to the science/theology debate. In fact, he may be regarded as its initiator. His project was extremely ambitious, difficult, and fragile – taking into account the fact that one of the partners in the debate was rather unconvinced that the whole conversation made any sense. Barbour begins with a preliminary attempt at specifying possible ways of relating science and religion (conflict, independence, dialogue, integration). He also tries to compare the methodologies of science and theology, and the role of scientific and theological models and paradigms. He shapes this conversation around the latest ideas developed in the philosophy of science by Karl Popper, Thomas Kuhn, and Imre Lakatos. He strives to name similarities and differences between science and theology in order to prove that a dialogue is possible and might be fruitful. Setting the stage, Barbour develops a historical overview of the science/theology relationship in the Western tradition, which then helps him to offer an analysis of contemporary issues, including: quantum theory, relativity, order and complexity, cosmology, design, chance and necessity, evolutionary theory, and the hierarchy of levels in biology. As a conclusion of his project, Barbour proposes a philosophical and theological reflection that suggests a major revision of the theological understanding of the nature of God and divine action.

I may disagree with many of Barbour's propositions – as will become clear in the second part of this entry – but this does not change the fact that I remain deeply indebted to Ian Barbour and to what he did to promote the dialogue between theology and natural science. His example inspired a variety of scholars such as Arthur Peacocke, John Polkinghorne, Robert John Russell, Sallie McFague, Philip Clayton, and many other prominent representatives of the theology/science debate. Sitting in my room and writing this post, I am looking at my bookshelf, knowing that the entire section on science and religion, including books published by Oxford University Press, Cambridge University Press, Johns Hopkins University Press, University of California Press, and Continuum, would probably never have come into existence if not for Ian Barbour and his work. May he rest in the peace of God, of whom I believe he now knows more than any of us.

Critical Evaluation of Barbour’s Project

In terms of philosophy, already in his first book, Issues in Science and Religion, Barbour argues in favor of 'critical realism' which, as he suggests, should be applied in both science and theology. He claims that scientific truth should be assessed in terms of its: 1) agreement with data, 2) internal coherence, 3) applicability to relevant variables, and 4) possible applicability in future research programs. He concludes that, in accordance with these rules, we have to agree that scientific truth is subject to continual revision, that is, that our access to reality (realism) is never ideal and needs to be corrected (critical realism). In saying this Barbour wants to distance himself from so-called naïve realism (the conviction that our access to reality is always and fully accurate), instrumentalism (scientific truths are only instruments to predict and control reality, with no aspiration of revealing the truth about the world), and idealism (which would dismiss empirical science, claiming that truth is discovered in mental analysis and description). In his definition of 'critical realism' Barbour is of course influenced by Thomas Kuhn's The Structure of Scientific Revolutions. He suggests that this methodological tool can be applied in theology as well, opening a dialogue between the two disciplines.

Although I agree with the main thesis of 'critical realism,' I am rather "critical" with regard to some important nuances of Barbour's position. What is at stake, or rather in danger, in 'critical realism' is 'realism.' There is a very thin line between the claim that science constantly needs to revise its truths and statements about reality, and the claim that we actually do not have access to reality as it is in itself. I have the impression that Barbour sometimes crosses this line. In Religion and Science he says explicitly at one point that "reality is inaccessible to us" (p. 110). When debating the nature of models in science, he emphasizes that they are merely "abstract symbol systems, which inadequately and selectively represent particular aspects of the world for specific purposes" (ibid., 117). They are just "imaginative human constructs." I am pretty sure that it was not his intention to go this way, but it sounds like cognitive skepticism. Moreover, I assume that Barbour, when speaking about 'classical realism,' refers to the Aristotelian-Thomistic tradition, which is usually associated with this term and is accused of naïve and uncritical realism. The famous French medievalist Étienne Gilson answers these accusations by saying that "[Classical] realism does not reject the idea of a critique of the different kinds of knowledge. It accepts it; it calls for it. But it does reject all a priori critique of knowledge as such. Instead of prescribing limits to reason a priori, which soon become limits to reality itself, realism accepts reality in toto and measures our knowledge by the rule of reality. Nothing that is validly known would be so if its object did not first exist – to which we can add that there is nothing to prevent us from seeking to define, within this real order, the relations between the thinker and the thing thought about." That is why Gilson suggests replacing 'critical realism' with 'methodical realism,' and the famous Cartesian cogito ergo sum (I think, therefore I am) with res sunt, ergo cogito (things are, therefore I think). (Gilson, Methodical Realism: A Handbook for Beginning Realists, 87)

Another issue at stake in Barbour's project is his use of 'critical realism' in theology. Comparing theology to science, he says that theological data consist of religious experience, stories, and rituals. As a Catholic I cannot agree with such a proposition. It sounds too much like the liberal Protestantism of Schleiermacher. We cannot forget that the primary source of theology and religion is Revelation. Although we know it through 'experience' and we have to 'interpret' it, we still assume that the truth consists not only in our experience/interpretation, but in God's intentionally revealing Himself. We believe that with the help of the Holy Spirit/faith we can know God's truth, independent of our experience, even if we know it as a way of darkness. Thus, Revelation has a kind of objectivity that is not present in scientific truths (or is different from the objectivity of science). It has its source in God's authority, which in the Catholic Tradition finds its special expression in the teaching of the Magisterium. Barbour seems to value more the tradition transmitted through stories and rituals, which he opposes to "abstract concepts and doctrinal beliefs" (Religion and Science, 113-14).

That is why, although I am all in favor of Barbour's approval of Kuhn's emphasis on the historical and sociological aspects of science and his suggestion that it develops through scientific revolutions rather than through a linear accumulation of data, I remain skeptical about his application of these ideas in theology. Naturally, faith has both sociological and historical aspects, but it differs significantly from scientific knowledge. In the case of scientific models we are making references to truths about nature that we may hope to come to know entirely, whereas in theology we are dealing with God, who will always remain unknown (transcendent). What would be the way of verifying or falsifying theological models? From the Catholic point of view we cannot rely merely on 'religious experience,' treated as an equivalent of empirical testing in science. For what would be the criteria of such an analysis? Does an experience of God's withdrawal and of spiritual desert mean that I should change my religious paradigm? It also seems that many theological paradigms based on the same data (e.g. Platonic Augustinianism and Aristotelian Thomism) are able to coexist, which is rather unlike the case of scientific paradigms, which – according to Kuhn – are radically different, if not totally exclusive, and remain incommensurable when it comes to the languages they use. Although this position has been softened by Lakatos' proposition of an unchangeable 'hard core' of a scientific paradigm and peripheral auxiliary hypotheses which are subject to change, the problem of the verification of theological truths still remains.

There are many other philosophical issues in Barbour's project that need to be addressed, but that would require a more serious form of publication. I am pretty sure that the coming months will bring many opportunities for a fruitful debate on his work. Turning to theology, I would like to mention only one issue, that is, Barbour's acceptance and development of the process theism of Whitehead, Hartshorne, Cobb, and Griffin. This is very significant for somebody coming from the Thomist tradition, for in subscribing to process theism Barbour supports and develops a criticism of classical theology. He thus follows the entire movement in contemporary theology that suggests the need for a substantial revision of the understanding of the nature of God and His divine action. The whole issue opens another debate, which I hope to comment on and develop on my blog in future entries.

I will stop here leaving the door open for further conversation…


Ian G. Barbour 1923-2013

R.I.P.