Occasionalism is a variation upon Cartesian metaphysics. The latter is the most notorious case of dualism (mind and body, for instance). The mind is a "mental substance". The body - a "material substance".
What permits the complex interactions that occur between these two disparate "substances"? The "unextended mind" and the "extended body" surely cannot interact without a mediating agency, God. The appearance is that of direct interaction, but this is an illusion maintained by Him. He moves the body when the mind wills it and places ideas in the mind when the body comes across other bodies. Descartes postulated that the mind is an active, unextended thought, while the body is a passive, unthinking extension. The First Substance and the Second Substance combine to form the Third Substance, Man. God - the Fourth, uncreated Substance - facilitates the direct interaction between the two within the third.
Foucher raised the question: how can God - a mental substance - interact with a material substance, the body? The answer offered was that God created the body (probably so that He would be able to interact with it). Leibnitz carried this further: his Monads, the units of reality, do not really react and interact. They only seem to do so because God created them with a pre-established harmony. The constant divine mediation was, thus, reduced to a one-time act of creation. This was considered both a logical consequence of occasionalism and its refutation by a reductio ad absurdum.
But was the fourth substance necessary at all? Could not an explanation of all the known facts be provided without it? The ratio between the number of known facts (the outcomes of observations) and the number of theory elements and entities employed to explain them is the parsimony ratio. Every newly discovered fact either reinforces the existing worldview or forces the introduction of a new one, through a "crisis" or a "revolution" (a "paradigm shift", in Kuhn's later-abandoned phrase). The new worldview need not be more parsimonious. A single new fact may precipitate the introduction of a dozen new theoretical entities, axioms and functions (curves between data points). The very delineation of the field of study serves to limit the number of facts that could exercise such an influence upon the existing worldview and still be considered pertinent. Parsimony is achieved, therefore, also by fixing the boundaries of the intellectual arena and/or by declaring quantitative or qualitative limits of relevance and negligibility.
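One possible schematic rendering of the ratio just defined (my formulation, not the author's): parsimony ratio = (number of known facts accounted for) / (number of theory elements and entities employed to account for them). The higher the ratio, the more parsimonious the worldview.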
The world is thus simplified through idealization. Yet, if this is carried too far, the whole edifice collapses. A fine balance must be maintained between the relevant and the irrelevant, what matters and what can be neglected, the comprehensiveness of the explanation and the partiality of the pre-defined limitations on the field of research.
This does not address the more basic issue of why we prefer simplicity to complexity. This preference runs through history: Aristotle, William of Ockham, Newton, Pascal - all praised parsimony and embraced it as a guiding principle of scientific work.
Biologically and spiritually, we are inclined to prefer things needed to things not needed. Moreover, we prefer things needed to admixtures of things needed and not needed. This is so because needed things, by definition, encourage survival and enhance its chances. Survival is also assisted by the construction of economical (parsimonious) theories.
We all engage in theory building as a mundane routine. "A tiger beheld means danger" is one such theory. Theories which incorporated fewer assumptions were quicker to process and enhanced the chances of survival. In the aforementioned feline example, the virtue of the theory and its efficacy lie in its simplicity (one observation, one prediction).
Had the theory been less parsimonious, it would have taken longer to process, and this would have rendered the prediction useless. The tiger would have prevailed. Thus, humans are Parsimony Machines (Ockham Machines): they select the shortest (and, thereby, most efficient) path to the production of true theorems, given a set of facts (observations) and a set of theories. Another way to describe the activity of Ockham Machines: they produce the maximal number of true theorems in any given period of time, given a set of facts and a set of theories (a sketch of such a selection procedure follows below). Poincaré, the French mathematician and philosopher, thought that Nature itself, this metaphysical entity which encompasses all, is parsimonious.
He believed that mathematical simplicity must be a sign of truth. A simple Nature would, indeed, appear this way (mathematically simple) despite the filters of theory and language. The "sufficient reason" (why the world exists rather than not) should then be transformed to read: "because it is the simplest of all possible worlds". That is to say: the world exists, and THIS world exists (rather than another), because it is the most parsimonious - not the best, as Leibnitz put it - of all possible worlds.
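As a loose illustration only - my sketch, not the author's formalism - an "Ockham Machine" of the kind described above can be rendered as a selection procedure: given a set of observations and a set of candidate theories, it keeps the candidates that account for all the observations and prefers, among them, the one resting on the fewest assumptions.

    # Hypothetical sketch of an "Ockham Machine" (illustrative names and numbers).
    from dataclasses import dataclass

    @dataclass
    class Theory:
        name: str
        assumptions: int   # number of postulates / theoretical entities employed
        explains: set      # observations the theory accounts for

    def ockham_select(observations, theories):
        # Keep only the theories that cover every observation...
        adequate = [t for t in theories if observations <= t.explains]
        # ...and, among those, prefer the most parsimonious one.
        return min(adequate, key=lambda t: t.assumptions) if adequate else None

    observations = {"stripes seen", "movement in the grass"}
    candidates = [
        Theory("a tiger is near", 1, {"stripes seen", "movement in the grass"}),
        Theory("a painted cloth blown by a trained wind", 3,
               {"stripes seen", "movement in the grass"}),
    ]
    print(ockham_select(observations, candidates).name)   # -> "a tiger is near"

Both candidates fit the observations; the selection simply falls to the one with the fewest assumptions.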
Parsimony is a necessary (though not sufficient) condition for a theory to be labelled "scientific".
But a scientific theory is neither a necessary nor a sufficient condition for parsimony. In other words: parsimony is possible within, and can be applied to, a non-scientific framework, and parsimony cannot be guaranteed by the fact that a theory is scientific (a theory could be scientific and yet not parsimonious). Parsimony is an extra-theoretical tool. Theories are under-determined by data. An infinite number of theories fits any finite set of data. This happens because of the gap between the infinite number of cases dealt with by the theory (the application set) and the finiteness of the data set, which is a subset of the application set.
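A toy illustration of under-determination - my example, not one given in the text - is the fact that three data points are fitted exactly by infinitely many curves, one for every value of a free parameter, so the data alone cannot single out "the" theory.

    # Under-determination: the points (0, 0), (1, 1), (2, 4) are fitted exactly by
    # f_c(x) = x**2 + c*x*(x-1)*(x-2) for EVERY value of c - one finite data set,
    # infinitely many theories.
    data = [(0, 0), (1, 1), (2, 4)]

    def f(x, c):
        return x ** 2 + c * x * (x - 1) * (x - 2)

    for c in (0, 1, -7, 1000):
        assert all(f(x, c) == y for x, y in data)   # every candidate fits perfectly

    # Yet the candidates disagree about anything outside the data set:
    print([f(3, c) for c in (0, 1, -7, 1000)])      # -> [9, 15, -33, 6009]

Every candidate agrees on the observed points yet diverges everywhere else; some extra-theoretical criterion - parsimony, for instance - is needed to choose among them.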
Parsimony is a rule of thumb. It allows us to concentrate our efforts on those theories most likely to succeed. Ultimately, it allows us to select THE theory that will constitute the prevailing worldview, until it is upset by new data.
Another question arises which has not been hitherto addressed: how do we know that we are implementing some mode of parsimony? In other words, what are the FORMAL requirements of parsimony?
The following conditions must be satisfied by any law or method of selection before it can be labelled "parsimonious":
(a) Exploration of a higher level of causality - the law must lead to a level of causality which will include the previous one and other, hitherto apparently unrelated, phenomena. It must lead to a cause, a reason, which will account for the set of data previously accounted for by another cause or reason AND for additional data. William of Ockham was, after all, a Franciscan monk, constantly in search of a Prima Causa.
(b) The law should either lead to, or be part of, an integrative process. This means that as previous theories or models are rigorously and correctly combined, certain entities or theory elements should be made redundant. Only those which we cannot dispense with should be left incorporated in the new worldview.
(c) The outcomes of any law of parsimony should be successfully subjected to scientific tests.
(d) These results should correspond with observations and with predictions yielded by the worldviews fostered by the law of parsimony under scrutiny.
(e) Laws of parsimony should be semantically correct. Their continuous application should bring about an evolution (or a punctuated evolution) of the very language used to convey the worldview, or at least of important language elements. The phrasing of the questions to be answered by the worldview should be influenced, as well. In extreme cases, a whole new language has to emerge, elaborated and formulated in accordance with the law of parsimony.
But, in most cases, there is just a replacement of a weaker language with a more powerful meta-language. Einstein's Special Theory of Relativity and Newtonian dynamics are a prime example of such an orderly linguistic transition, which was the direct result of the courageous application of a law of parsimony.
(f) Laws of parsimony should be totally subjected to (actually, subsumed under) the laws of Logic and the laws of Nature. They must not lead to, or entail, a contradiction, for instance, or a tautology. In physics, they must adhere to laws of causality or correlation and refrain from teleology.
(g) Laws of parsimony must accommodate paradoxes.
Paradox Accommodation means that theories, theory elements, the language, a whole worldview will have to be adapted to avoid paradoxes. The goals of a theory or its domain, for instance, could be minimized to avoid paradoxes. But the mechanism of adaptation is complemented by a mechanism of adoption. A law of parsimony could lead to the inevitable adoption of a paradox. Both the horns of a dilemma are, then, adopted.
This, inevitably, leads to a crisis whose resolution is obtained through the introduction of a new worldview. New assumptions are parsimoniously adopted and the paradox disappears.
Paradox Accommodation is an important hallmark of a true law of parsimony in operation. Paradox Intolerance is another. Laws of parsimony give theories and worldviews a "licence" to ignore paradoxes which lie outside the domain covered by the parsimonious set of data and rules. It is normal to have a conflict between the non-parsimonious sets and the parsimonious one.
Paradoxes are the results of these conflicts and the most potent weapons of the non-parsimonious sets. But the law of parsimony, to deserve its name, should tell us clearly and unequivocally when to adopt a paradox and when to exclude it. To be able to achieve this formidable task, every law of parsimony comes equipped with a metaphysical interpretation, whose aim is to plausibly keep nagging paradoxes and questions at a distance. The interpretation puts the results of the formalism in the context of a meaningful universe and provides a sense of direction, causality, order and even "intent". The Copenhagen interpretation of Quantum Mechanics is an important member of this species.
(h) The law of parsimony must apply both to the theory entities AND to observable results, both being part of a coherent, internally and externally consistent, logical (in short: scientific) theory.
It is divergent-convergent: it diverges from strict correspondence to reality while theorizing, only to converge with it when testing the predictions yielded by the theory. Quarks may or may not exist - but their effects do, and these effects are observable.
(i) A law of parsimony has to be invariant under all transformations and permutations of the theory entities. It is almost tempting to say that it should demand symmetry - were symmetry not merely an aesthetic requirement, and one often violated.
(j) The law of parsimony should aspire to a minimization of the number of postulates, axioms, curves between data points, theory entities, etc.
This is the principle of the maximization of uncertainty. The more uncertainty introduced by NOT postulating explicitly, the more powerful and rigorous the theory/worldview. A theory with one assumption and one theoretical entity renders a lot of the world an uncertain place. The uncertainty is expelled by using the theory and its rules and applying them to observational data or to other theoretical constructs and entities.
The Grand Unified Theories of physics seek to do away with four disparate forces and to gain a single one instead.
(k) A sense of beauty, of aesthetic superiority, of acceptability and of simplicity should be the by-products of the application of a law of parsimony. These sensations have often been cited, by practitioners of science, as influential factors weighing in favour of a particular theory.
(l) Laws of parsimony entail the arbitrary selection of facts, observations and experimental results to be related to and included in the parsimonious set. This is the parsimonious selection process, and it is closely tied to the concept of negligibility and to the methodology of idealization and reduction. The process of parsimonious selection is very much like a strategy in a game in which both the number of players and the rules of the game are finite.
The entry of a new player (an observation, the result of an experiment) sometimes transforms the game and, at other times, creates a whole new game. All the players are then moved into the new game, positioned there and subjected to its new rules. This, of course, can lead to an infinite regression. To effect a parsimonious selection, a theory must be available whose rules will dictate the selection.
But such a theory must also be subordinated to a law of parsimony (which means that it has to parsimoniously select its own facts, etc.). A meta-theory must, therefore, exist, which will inform the lower-level theory how to implement its own parsimonious selection, and so on and so forth, ad infinitum.
(m) A law of parsimony falsifies everything that does not adhere to its tenets.
Superfluous entities are not only unnecessary - they are, in all likelihood, false. Theories which were not subjected to the tests of parsimony are, probably, not only non-rigorous but also positively false.
(n) A law of parsimony must apply the principle of redundant identity. Two facets, two aspects, two dimensions of the same thing - must be construed as one and devoid of an autonomous standing, not as separate and independent.
(o) The laws of parsimony are "back determined" and, consequently, enforce "back determination" on all the theories and worldviews to which they apply.
For any given data set and set of rules, a number of parsimony sets can be postulated. To decide between them, additional facts are needed. These will be discovered in the future and, thus, the future "back determines" the right parsimony set. Either there is a finite parsimony group from which all the temporary groups are derived - or no such group exists and an infinity of parsimony sets is possible, the results of an infinity of data sets.
This, of course, is thinly veiled pluralism. In the former alternative, the number of facts/observations/experiments that are required in order to determine the right parsimony set is finite. But there is a third possibility: that there is an eternal, single parsimony set and all our current parsimony sets are its asymptotic approximations. This is monism in disguise. Also, there seems to be an inherent (though solely intuitive) conflict between parsimony and infinity.
(p) A law of parsimony must be seen to be in conflict with the principle of the multiplicity of substitutes. This is the result of an empirical and pragmatic observation: the removal of one theory element or entity from a theory precipitates its substitution by two or more theory elements or entities (if the preservation of the theory is sought). It is this principle that is the driving force behind scientific crises and revolutions. Entities do multiply, and Ockham's Razor is rarely used until it is too late and the theory has to be replaced in its entirety. This is a psychological and social phenomenon, not an inevitable feature of scientific progress.
Worldviews collapse under the mere weight of their substituting, multiplying elements. Ptolemy's cosmology fell prey to the Copernican model not because the latter was more efficient, but because it contained fewer theory elements, axioms and equations. A law of parsimony must warn against such behaviour and restrain it or, finally, provide the ailing theory with a coup de grace.
(q) A law of parsimony must allow for full convertibility of the phenomenal to the noumenal and of the universal to the particular. Put more simply: no law of parsimony can allow a distinction between our data and the "real" world to be upheld.
Nor can it tolerate the postulation of Platonic "Forms" and "Ideas" which are not entirely reflected in the particular.
(r) A law of parsimony implies necessity. To assume that the world is contingent is to postulate the existence of yet another entity upon which the world is dependent for its existence. It is to theorize on yet another principle of action. Contingency is the source of entity multiplication and goes against the grain of parsimony.
Of course, causality should not be confused with contingency. The former is deterministic - the latter the result of some kind of free will.
The explicit, stated parsimony - the one formulated, formalized and analysed - is connected to an implicit, less evident sort, and to a latent parsimony. Implicit parsimony is the set of rules and assumptions about the world that are known as formal logic. Latent parsimony is the set of rules that allows for a (relatively) smooth transition to be effected between theories and worldviews in times of crisis.
These are the rules of parsimony which govern scientific revolutions. The rule stated in article (a) above is a latent one: in order for the transition between old theories and new ones to be valid, it must also be a transition from a lower level of causality to a higher one.
Efficient, workable parsimony is either obstructed, or simply not achieved, through the following avenues of action:
Association - the formation of networks of ideas linked by way of verbal, intuitive, or structural association does not lead to more parsimonious results. Naturally, a syntactic, grammatical, structural, or other theoretical rule can be made evident by the results of this technique.
But to discern such a rule, the scientist must distance himself from the associative chains, to acquire a bird's eye view, or, on the contrary, to isolate, arbitrarily or not, a part of the chain for closer inspection. Association often leads to profusion and to an embarrassment of riches. The same observations apply to other forms of chaining, flowing and networking.
Incorporation without integration (that is, without elimination of redundancies) leads to the formation of hybrid theories.
These cannot survive long. Incorporation is motivated by conflict between entities, postulates or theory elements. It is through incorporation that the protectors of the "old truth" hope to prevail. It is an interim stage between old and new.
The conflict blows up in the perpetrators' face and a new theory is invented. Incorporation is the sworn enemy of parsimony because it is politically motivated. It keeps everyone happy by not giving up anything and accumulating entities. This entity hoarding is poisonous and undoes the whole hyper-structure.
Contingency - see (r) above.
Strict monism or pluralism - see (o) above.
Comprehensiveness prevents parsimony. To obtain a description of the world which complies with a law of parsimony, one has to ignore and neglect many elements, facts and observations. Gödel demonstrated the paradoxes inherent in any comprehensive formal logical system. To fully describe the world, however, one would need an infinite number of assumptions, axioms, theoretical entities, elements, functions and variables. This is anathema to parsimony.
The preceding point excludes the reconciliation of parsimony with monovalent correspondence. An isomorphic mapping of the world onto the worldview - a realistic rendering of the universe using theoretical entities and other language elements - would hardly be expected to be parsimonious. Sticking to facts (without employing theory elements) would generate a pluralistic multiplication of entities. Realism is like using a machine language to run a supercomputer. The path of convergence (with the world) - convergence (with predictions yielded by the theory) - leads to a proliferation of categories, each one populated by sparse specimens.
Species and genera abound. The worldview is marred by too many details, crowded by too many apparently unrelated observations.
Finally, if the field of research is wrongly - too narrowly - defined, this could be detrimental to the positing of meaningful questions and to the expectation of receiving meaningful replies to them (experimental outcomes). This lands us where we started: the psychophysical problem is, perhaps, too narrowly defined. Dominated by Physics, the questions asked are biased, or excluded altogether.
Perhaps a Fourth Substance IS the parsimonious answer, after all.