Mahasweta Chaudhury
(University of Calcutta)

Realism and Comparability

Some Cognitive Issues

The ontological status of objects of knowledge is crucial for epistemic theories. It seems that if these are mind-independent then the cognitive relativists do not have a strong ground. On the other hand, if objects of knowledge are constructions or at least mind-dependent, then it is a win for them. Since scientific theory/knowledge enjoys a sacrosanct position, any effective assault on its objectivity, truth, etc. can be regarded as a virtual victory for cognitive relativism.

I shall try to argue in this paper, by examining two different aspects of cognitive relativism, whether the arguments based on these aspects are adequate. These are the methodological and ontological aspects of cognitive issues. The first involves the most powerful argument, namely the incommensurability of theories, which can be contested. The second aspect concerns the ontological status of epistemic objects. It is claimed in this section that although we can concede a lot to the constructionist view, most of the arguments against the objectivist theory of knowledge can be met by making a distinction between commonsense realism and scientific realism. Although scientific theories and objects involve a lot of constructionist elements, commonsense objects, surprisingly, do not involve those problems and endure changes in cognitive systems in a more or less stable way.

I. The Methodological Aspect : Incommensurability of Theories

The most powerful argument in favour of cognitive relativism is that the ground on which one theory is accepted rather than another is not rational or objective, because rival theories are not comparable. Theory-choice is mostly based on historical-sociological grounds. Even 'propaganda, conceit and lie' are suggested (as done by Paul Feyerabend in the case of Galileo) as reasons for accepting a theory such as Copernicus' heliocentric theory in preference to the Ptolemaic geocentric theory.

The argument in favour of cognitive relativism thus relies heavily on the assumption of the incommensurability of theories, as is well known in the literature on methodology. Let us start with the possible senses in which 'incommensurability' can be understood, and then see if theories are incommensurable. In this connection we can refer to various senses[1] without assuming that any of these is primary and the others secondary.

1) In one sense 'incommensurable' means 'not commensurable in respect of magnitude'.
2) In another it means 'incapable of being measured (in comparison with)'.
3) In mathematics it means 'having no common factors integral or fractional'.
4) Another use in mathematics refers to 'irrational - not commensurate with the natural numbers'. An example is a non-terminating, non-repeating decimal.
Incommensurability of theories generally means (1) and (2) although the protagonists are not very precise about the meaning. (3) and (4) sometimes constitute the content of some version of the incommensurability thesis.
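The mathematical senses (3) and (4) go back to the Greek discovery that the side and diagonal of a square admit no common measure. The following sketch (a standard reconstruction, not taken from the original text) shows why:

```latex
% If the diagonal d and side s had a common measure, d/s = \sqrt{2}
% would be a ratio of integers p/q in lowest terms. But by Pythagoras,
d^2 = s^2 + s^2 \;\Rightarrow\; \left(\frac{d}{s}\right)^2 = 2
\;\Rightarrow\; p^2 = 2q^2 .
% Then p^2 is even, hence p = 2k; substituting gives q^2 = 2k^2,
% so q is even as well, contradicting lowest terms. No unit, however
% small, fits both d and s a whole number of times.
```

It is this literal sense of 'no common measure' that the methodological uses of 'incommensurable' borrow metaphorically.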

Let us start with an examination of whether theories are comparable (or not) in any acceptable sense, and also of whether cognitive relativism gains any ground by such analysis. A scientific theory is one which is always vulnerable to falsification, or at least to being superseded by a 'better' theory. What makes a theory 'better' and how theories are logically related to each other are methodological issues, issues undermined by a stance that regards rival theories as incommensurable or incomparable as to their cognitive/scientific value. Two theories are rivals in a strong sense if their intended subject-matter is identical and, in particular, if what either of them says about this common subject-matter is incompatible with what the other says. Such is the case with Einstein's theory of the gravitational field as compared with Newton's. Quantum mechanics and the classical mechanics of mass-points make up another pair of rival theories. In a weaker sense one theory is a rival of another if the objects of the second are not identical with, but ontologically dependent on, the objects of the first. Thus we may see quantum mechanics as a rival of classical continuum mechanics, and any atomic theory of matter (classical or quantum) as a rival of (phenomenological) thermodynamics, because matter as the object of the latter is composed of atoms as the primary objects of the former. In these cases, although the theories may be strictly speaking incompatible, their total relation may nevertheless be seen as that of one theory reducing or explaining the other, and that is exactly what physical theories aim at : to understand and explain the properties of matter by those of its ultimate constituents.

The question is : Can rival theories be evaluated? And if so, on what basis? This divides philosophers of science. One camp holds that theories are to be evaluated by analysis of rival theories against objective and universal standards. On this view only logical parameters can decide which is the better theory, and the succession of theories is progressive. All this is denied by the opposing view, according to which the changes in meaning and conceptual incommensurabilities involved in the development of a discipline defy all attempts to present new theories as adding to our previous knowledge rather than being, as it was claimed, total replacements of the earlier views. In this wider sense the development of the natural sciences, and of physics in particular, was assimilated to various other changes which depended essentially on extra-logical factors and were determined, if not perverted, by the idiosyncrasies of their originators.

Unlike most philosophers of science, physicists have developed a more moderate view[2] on the matter. Instead of the philosophical view that our knowledge of nature undergoes a continuous development, one can emphasise[3] that the development of theoretical physics has always proceeded by leaps and bounds. This however must not mislead us into believing that the new ideas are absolutely true and the old had been totally worthless. We must rather acknowledge that the earlier theory too was useful, and that there is a possibility that the new theory in turn will be followed by an even better one.

Under the impact of relativity and quantum theory, Boltzmann's view has been developed into the standard version that an empirically successful though finally refuted theory S becomes a limiting case of its successor T. As viewed from T, the theory S remains approximately valid under conditions a specifying the limit of validity, conditions which become known together with T.[4]
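The limiting-case relation can be made concrete with a standard textbook illustration (not in the original text): the Newtonian expression for kinetic energy re-emerges from the relativistic one when v is small compared with c:

```latex
E = \frac{m c^2}{\sqrt{1 - v^2/c^2}}
  = m c^2 \left( 1 + \frac{1}{2}\frac{v^2}{c^2}
                   + \frac{3}{8}\frac{v^4}{c^4} + \cdots \right)
  = m c^2 + \frac{1}{2} m v^2 + O\!\left(\frac{v^4}{c^4}\right)
```

Here the requirement that v²/c² be small plays the role of the conditions a: within that domain the refuted theory S (Newtonian mechanics) remains approximately valid as viewed from its successor T (special relativity).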

The standard parameters of comparing theories are (a) logic, (b) conceptual assimilation, (c) (deductive) consequence class and (d) empirical support.
(a) In comparing two physical theories we can assume a common logic, although some assumptions of classical logic, such as 'every sentence of a language is either true or false', are highly questionable for modern physics. Although some people believe that the quantum theory of a single object (as opposed to a statistical ensemble) needs a deviant logic, advocates of such quantum logic have not yet been able to reformulate the theory so as to be founded only on quantum logic, in exactly the way in which classical physics can be founded on classical logic. Moreover the fundamental laws of logic are equally applicable to both classical and quantum mechanics.
(b) Rival theories often are different not only in their contention but even in their basic concepts. The concept of orbit for example is basic to classical particle mechanics but is absent in quantum mechanics. Similarly, as is well-known, the concept of absolute time and space is rejected by the principles of relativistic space-time in relativistic mechanics. Thus these pairs of theories seem to suffer from a certain conceptual incommensurability.
However, conceptual incommensurability does not occur throughout, and the mere existence of incommensurability does not exclude the existence of a common conceptual basis for two theories, provided we allow for a sufficiently wide notion of conceptual equivalence or at least comparability. For example, the Euclidean 'straight line' is comparable or equivalent to the geodesic in non-Euclidean geometry, and the two geometries are rivals.
(c) Approximative reduction is again a ground for comparing two rival theories T and T1. Often a theory T does not only follow from T1 (that is, is a member of its consequence class) but is the strongest consequence of T1. Newton's theory of gravitating mass-points as compared with Kepler's laws is a case in point. Physics is full of such cases of approximate reduction: Dirac's equation with Pauli's equation, and the latter with Schroedinger's equation, are similar cases of deductively related theories with some additional assumptions.[5]
(d) The value of a theory in physics or in any other science is however ultimately judged by its empirical import. Thus the superiority of a theory T1 over T cannot be judged independently of facts. Indeed, deduction is important because it is a truth-preserving procedure, but in empirical science its importance lies on a different level, since both T and T1 may be false but still valuable. So we can formulate the concept of the empirical superiority of T1 over T in such a way that,
(i) every empirical success of T can be an empirical success of T1, but not every empirical success of T1 is an empirical success of T. 'Empirical success' is generic, covering the explanation of some phenomenon, the solving of some theoretical or practical problems, etc.
(ii) there are no empirical successes of T such that they cannot be successes for T1.
Popper formulated such adequacy conditions for theory-choice; he called this the corroborative value of a theory. The more corroborated (empirically established) a theory is, the more acceptable it is. But unfortunately this emphasis on empirical import is regarded by his critics as a shift of ground from his original falsifiability test/criterion, allowing induction to sneak into theory-evaluation.
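Writing Succ(T) for the set of empirical successes of a theory T, conditions (i) and (ii) can be compressed into a proper-inclusion requirement (a schematic rendering, not the author's own notation):

```latex
\mathrm{Succ}(T) \subseteq \mathrm{Succ}(T_1)
\quad \text{and} \quad
\mathrm{Succ}(T_1) \not\subseteq \mathrm{Succ}(T),
\qquad \text{i.e.} \qquad
\mathrm{Succ}(T) \subsetneq \mathrm{Succ}(T_1).
```

On this rendering T1 is empirically superior to T exactly when T1's successes properly include those of T.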

There are other problems in connection with empirical success and failure. For instance, what would be the criterion for such success? Secondly,[6] an earlier theory may sometimes explain something which a later theory cannot. For example, pre-Daltonian phlogiston theory could explain many phenomena and solve many problems which nineteenth-century Daltonian chemistry could not solve. The other problem in this respect is to identify the conditions relating to any corpus of empirical knowledge E in such a way that at least (i) would follow for empirical statements belonging to E.

The issues discussed above are some of the difficulties confronted in the comparison of theories, which however are not insuperable. Cognitive relativists consider pluralities of theories an enrichment of knowledge; the story of physics however says something different. Most physicists view physics as characterised by 'a greater conceptual unity today than at any time in its history'. Completing the conceptual unity of physics is regarded as a finite task to be accomplished some day. More and more effort is being devoted to the search for a unified theory that can explain all the basic problems of physical nature.

Stephen Hawking of Cambridge (and Michael Green of London, John Schwarz of Caltech, David Gross and Edward Witten of Princeton, and many others) attempts to tie up all the laws of the universe by a single principle. To a scientist (physicist) every phenomenon is due to some force. No change (action/reaction) is possible without some force. To date, four kinds of force are known : gravitation, which brings the apple from a tree to the ground and keeps the moon in place round the earth; electromagnetism, which makes the electric fan move; the strong force, which binds the nuclear particles of atoms with tremendous force; and the weak interaction, which is responsible for radioactivity.

Our ordinary activities – eating, playing, sleeping, etc. – as well as the motion of cars, planes and machines, and chemical reactions, are all due to the electromagnetic force. Even the birth and death of stars in the remote galaxies are due to gravitation, electromagnetism and the strong force. Without gravitation, nothing would stay in place in this world; everything would fly apart. Similarly, without the strong force, neither any object nor any creature would have existed: if protons and neutrons were not held together in the nucleus of atoms, the latter would be fragile and disorderly. The weak force however is not so pervasive, except in the radioactivity of matter. Anyway, these four kinds of force are behind all physical events.

The further question is: why four types of forces? Why not more, or fewer? 'Four' seems an arbitrary number. It would be more rational if we could think of one kind of force that can explain everything. That would also be consistent with the harmony of the universe and the uniformity of nature. This belief is almost like the religious belief in one and only one God. To scientists, law or theory is like that - ultimately there should be one principle that can explain every variety of events.
That this is not mere fantasy can be borne out by history. A magnet attracting iron particles and the conduction of electricity through a copper wire were regarded as two different kinds of action. But in the nineteenth century Michael Faraday showed that electricity and magnetism are not two different actions; since then we consider them as the same electromagnetic force. Similarly, Abdus Salam and Weinberg (in 1967) established that the electromagnetic and weak forces are actually two varieties of the same force. That means nature suggests unity amid diversity. If we progress further we may find a principle that explains the different phenomena and also incorporates the different forces within it.

Indeed, nature's own puzzle creates much of our confusion about it. Laws are different in the micro and macro worlds of objects. Celestial bodies (stars, planets) move by the principle of gravitation, whereas electrons and protons follow the electromagnetic, strong and weak forces; the latter are explained by quantum mechanics. The relativistic theory of gravity does not have anything in common with quantum mechanics. Both theories are found to be true and to work successfully, but only in their respective fields. Relativity cannot explain the activity of electrons and protons, while celestial bodies do not follow quantum mechanics. Many scientists/philosophers hope that the unification of gravity and quantum theory can dispel this puzzle by the discovery of a theory that can explain every event from galaxy to neutron. That would be a 'theory of everything', reducing all the four types to the same force and establishing quantum and gravity as the same phenomenon. That theory would be the 'master key' to understanding the entire universe of the macro and micro worlds of objects. It is also called the final theory. Stephen Hawking thinks that the discovery of such a theory would be the end of discovery for physics.

Einstein was the first to attempt to unify gravity with the second force, electromagnetism (the strong and weak forces were not known then), and he pursued this till the end of his life, because law was God to him and 'God does not play dice'. Therefore there cannot be two Gods or two ultimate theories. In the seventies Hawking talked about 'supergravity' (an interaction of force among particles following formulae similar to Einstein's). After that 'superstring theory' took over, according to which particles are not like points but vibrations, rather like strings. Abhay Ashtekar and others however challenge the assumption of such a theory, holding that empty space would not be continuous but more like a sponge or a perforated thing: somewhere it is present, somewhere it is not. But even this is only a metaphor to understand the concept. We should remember that in the case of a sponge or a cloth the empty places are still space; but in the case of space itself they are not - they are completely empty, no space is there. This new geometry of empty space is moving towards the unification of gravity and quantum theory. Ashtekar's formula is one of many similar attempts. But whether such a unification formula can ever be found is still an open question.

To come back to our original issue of the comparison of theories: although the 'measuring' of rival theories often involves some difficulties, it is possible to compare them on the above-mentioned counts (a)-(d). Moreover the incommensurability claim overlooks an important aspect of rival theories, namely that the latter have some 'common factors'. If theories are conjectures, what do they aim at? One thing at least is 'common' between two rival theories: both are attempts to solve some common problem, theoretical or practical. Even if theories are not accepted on logical grounds (truth, rationality, objectivity, etc.), they can at least be compared as to their problem-solving capacity. Indeed the pragmatist/instrumentalist account is also based on such a ground, but my contention here is not exactly the same. A problem can surely also be solved successfully by a rule of thumb; that success cannot by itself raise the epistemic status of a theory. But theory comparison can at least be done on the basis of the common problem the theories aim to solve. By the solution of a problem I mean the way in which it is resolved by theoretical assimilation with other extant theories and background knowledge, with minimum ad hoc assumptions, and finally with empirical success.

One final comment: without accepting any monolithic concept of rationality, one can hold that accepting a theory is not arbitrary or based on extra-logical grounds. If this position holds, then cognitive relativism (the view that there are no rational grounds to vindicate one set of beliefs/theories rather than another) requires some further arguments to establish its claim.


II. The Ontological Aspect : Realism, Commonsense and Science

In a broad sense realism stands for a position which asserts the existence of some disputed kind of entity (universals, material objects, causal laws, numbers, probabilities, propensities, etc.). Commonsense realism, often called naive realism, represents the pedestrian view that the objects of our knowledge (mostly of perceptual experience), like apples and oranges, have independent existence. This assumption is taken for granted in our practical life. But how do we know that this is an apple which does not need the knowing mind for its perceptual properties? If the direct relation between the subject and the object holds, then why are so many discrepancies (illusions, hallucinations, etc.) possible? So a reflective approach would be one that takes into account the complex processes of various sorts that intervene between the world and the knower.

Scientific realism asserts the existence and activity of scientific objects absolutely independently of the enquiry of which they are the objects, or more generally of all human activity. Scientific realism, then, is the theory that the objects of scientific enquiry exist and act, for the most part, quite independently of scientists and their activity. It becomes mandatory thus to make a distinction between the (relatively) unchanging real objects which exist outside and perdure independently of the scientific process, and the changing and theoretically imbued cognitive objects which are produced within science as a function and result of its practice. I want to examine this distinction, usually associated with commonsense and scientific realism respectively, with a view to arguing, first, that the distinction is overblown and the two approaches operate at two cognitive levels, so that scientific realism does not negate commonsense realism. Second, commonsense realism is unduly caricatured, but some amendments can make it more plausible if we consider the complex process through which we know the world. There are good reasons to uphold it, as many of us do.

The schism between scientific realism and commonsense realism is held to be so wide that accepting one of these positions seems to negate the other. But if it is realized that scientific realism and commonsense realism posit objects and their behaviours at two different levels, it can be recognised that accepting one world-view does not necessarily exclude the other.

As stated earlier, commonsense realism insists that if there are intersubjectively shared perceptual experiences of macro-objects like, say, apples and oranges or the sun, then there are apples and oranges and the sun which exist independently of our perceptions. It is clear that this kind of naive realism has the burden of justifying the ontological status of mirages and rainbows, magic or jugglery. Illusory macro-objects, or rather seeming objects, pose a serious problem for commonsense realism.
Scientific realism is mainly concerned with the internal infrastructure of things, and it attempts to give a causal explanation of our perceptual experiences. It claims to decompose or analyse macro-objects like apples and oranges into their structural elements: molecules, protons, electrons, etc. Moreover, scientific realism appears to draw its high status from the purported 'omnipotence of physics' (to borrow John Watkins's terms). It assumes that all meaningful questions or problems can be settled, at least in principle, by physical theories. If there are some seemingly unsettled problems (like, say, apparent miracles), then those questions can be settled by presuming the incompleteness of the present state of physics. This clearly shows either a pre-emptive measure to retain the omniscience of physics, or else the issuing of an unlimited rain check to it, making the doctrine irrefutable.

The omnipotence claim of physics however is not unquestionable. John Watkins[7] presents some questions for which physics does not have any answer. For example, the question of the 'Now' as distinguished from 'earlier-later', the question of how consciousness arises or why only humans are privileged with the sense of humour in the animal kingdom - are questions physics is unable to answer. In cases of creative or intellectual activities again, the inadequacy of science is more apparent. Science indeed is a good copier but not an inventor of a creative/intellectual value. Garry Kasparov set a good example of it by beating the Super Computer, Deep Blue, in a chess competition in 1996. If the omnipotence of physics can not be refuted, it can not be upheld either. Thus the high status of scientific realism against commonsense also can not remain intact.

Be that as it may, the assertion of scientific realism that only the inner structure is real, and that the surface features of phenomena are only caused by the infrastructure, assumes the truth of scientific theories. The apparent conflict between commonsense realism and scientific realism is due to the long-held dichotomy of the structural level as real (as held by Locke) and the surface level as real (as held by Berkeley).

The belief that only the entities at the structural level are real gains credibility if one believes in a 'stopping place' beyond which objects cannot be decomposed into micro-components. A standard argument for the unreality of macro-objects, as given by Grover Maxwell,[8] is based on the fact that on analysis of an apple we find subatomic particles that constitute an apple, and the relations that subsist among those particles, but not an apple! Therefore the latter does not exist. This argument is not correct. As Watkins shows with the example of one of Newton's experiments: sunlight as an object can be decomposed by one method (say, prism 1) into components a, b, c (the lights of the spectrum), which can be reconstituted by another method (say, prism 2, suitably placed) into the original object, namely the sunlight. The relation of a macro-object and its micro-particles can be conceived in different ways:

(a) As a stable and determinate object amidst the flux of the changing and indeterminate.
(b) As a logical construct out of the components.
(c) If the micro-objects and some organising rules or relations among them can reconstruct the object, then the latter can not be just a concept of the object but something real.
(a) is the Kantian way of explaining the 'stability' of the 'object' among the fleeting intuitions by the faculty of imagination. (b) is the Russellian move of reducing the reality of an object to its logical components. (c) can avoid the difficulties of both and can also explain our perception of simulated reality.

Moreover, even if the micro-level analysis is corroborated by actual scientific practice, it does not make the ontological status of a macro-object any less stable. Our biological set-up is such as to receive some signals from the micro-particles of, say, an apple or a piece of art. But we can decode the message in ways which can discriminate between a good and a bad apple, and appreciate a genuine piece of art. An analysis of an object into its root cause or prime components cannot make the object unreal. After all, apples and oranges are more stable and enduring than the subatomic particles they are composed of.

Commonsense realism speaks of objects at a level which is different from the level at which micro-objects are posited. As said before, one can also question whether natural science is 'realist' or not, which again can be settled empirically, viz. by determining whether or not scientists believe, or behave as if, the theoretical terms they employ possess real referents independently of their theorising. Realism is not a theory of knowledge or truth but of being, although as such it is bound to possess epistemological implications: the objects investigated by the sciences endure and operate independently of human activity, and hence of both sense-experience and thought. Every theory of scientific knowledge presupposes a theory of what the world must be like for knowledge, under the descriptions given it by the theory, to be possible.[9]

In short, it becomes mandatory to make the distinction between the (relatively) unchanging real objects which exist outside and endure independently of the scientific process, and the changing cognitive objects, as said in the beginning, that are products of the scientific tradition. The so-called difference between 'transitive' and 'intransitive' objects of scientific knowledge is crucial for understanding the micro-level dichotomy of reality.

Commonsense realism speaks of macro-objects independently of the scientific process; scientific realism posits micro-objects, which are cognitive objects within the scientific tradition and thus change with changes in scientific theories. Accepting objects at one level does not necessarily negate the reality of the objects at the other level.

One more possible area needs our attention in this context: namely, a possible new world in which reality itself might become a manufactured and metered commodity. Virtual reality is not 'just possible' any more. It is not only opening up methods of mind amplification but also widening the vista of spatial experience. Computer-generated simulated reality resembles traditional macro-objects, but we know these are micro-body generated realities. Here the dichotomy of commonsense and scientific realism faces a problem.

Let us now turn to the second component of the issue, namely the cognitive status of commonsense objects. A probe into scientific theories will lead us there. The principle of complementarity (of quantum physics) turns the so-called 'elementary building blocks' of classical physics into Janus-faced entities that behave under certain circumstances like hard lumps of matter (particles) and in other circumstances as waves or vibrations propagated in a vacuum. With Einstein's magic formula, E=mc², the process of the dematerialization of matter started, a process completed in the '20s by de Broglie and Schroedinger. The famous formula E=mc² implies that the mass of a particle must not be conceived as some stable elementary material but as a concentrated pattern of energy, locked up in what appears to us as matter. The 'stuff' of which protons are made is nebulous; A. Koestler[10] compared it with the stuff of which dreams are made. In the physicist's bubble chambers, high-energy 'elementary' particles collide and annihilate each other or create new particles which give rise to a new chain of events. The particles in question are of course infinitesimally small and short-lived (some surviving less than a millionth of a second), yet they leave tracks in the bubble chamber which enable the physicists to decide which of the 'elementary particles' caused them.
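The scale of the energy 'locked up' in matter can be gauged from a rough calculation (an illustrative figure, not from the original text):

```latex
E = m c^2 = (1\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m/s})^2
          = 9 \times 10^{16}\,\mathrm{J},
```

of the order of the yearly output of several large power stations; burning the same kilogram of coal, by contrast, releases only about 3 × 10⁷ J.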

The fundamental lesson which the bubble chamber and other sophisticated instruments teach the physicist is that on the subatomic level our concepts of space, time, matter and conventional logic no longer apply. Thus two particles may collide and break into pieces, but these pieces may turn out to be no smaller than the original particles, because the kinetic energy liberated in the course of the collision has been transformed into 'mass'. Or a photon, the elementary unit of light, which has no mass, can give birth to an electron-positron pair, which does have mass; and that pair might subsequently collide and, by the reverse process, transform itself into photons. The fantastic events in the bubble chamber have been compared to the dance of Shiva,[11] with its rhythmic alternations of creation and destruction.

All this is a long way from the simplistic Rutherford-Bohr model of the beginning of the century, which represented atoms as miniature solar systems. But we know the model ran into problems: the electrons were found to behave quite unlike planets; they kept jumping from one orbit to another, and the orbits themselves were not well-defined trajectories but wide, blurred tracks. There are other discrepancies as well.[12] As Russell puts it, the idea of a hard little lump which is the electron or proton is an illegitimate intrusion of commonsense notions derived from touch.[13] "Nuclear matter is thus a form of matter entirely different from anything we experience 'up here' in our macroscopic environment."[14] Atoms are not things, in the sense that concepts like location, velocity, energy, size, etc. do not straightforwardly apply to them. Heisenberg comments that when we get down to the atomic level, the objective world in space and time no longer exists.[15]

Lifting our eyes from the bubble chamber to the starry skies, our commonsense notions of space, time and causality turn out to be as inadequate as when we try to apply them to the subatomic domain. If parts of the universe are furnished with galaxies of anti-matter (which consists of atoms in which the electric charges of their constituents are reversed), there is a fair chance that in those galaxies the flow of time is reversed. Feynman's diagrams show that particles in the microcosm, too, are supposed to move for a short while backwards in time. Thus our medium-sized world, with its homely notions of space, time and causality, appears to sit between the mega and micro realms of reality, to which these parochial notions do not apply.


The upshot of this account, first of the comparability of theories and then of multilevel reality and the different concepts and theories that describe it (even quantum reality is a description), is to show that

(1) There is no common ground to judge which one is really real, or more real than the other. Non-Euclidean geometry does not negate Euclidean geometry (which is still a science of terrestrial space) but only shows a way to understand the celestial space of spheres. Similarly, subatomic-level reality cannot make our commonsense objects like apples and oranges (which still obey the laws of classical physics) unreal, or at least not proper cognitive objects.
(2) It is also difficult to decide why protons and electrons would be of a higher ontological status than apples and oranges. After all, cognitive systems change with new hypotheses and new sets of theoretical entities. But commonsense objects seem to be the same through different sets of theoretical entities.
(3) The greatest problem of commonsense realism can be overcome to some extent if we recognise that the posited objects are not known as directly as is usually supposed. Even perceptual processes are quite complex and involve a good deal of theoretical elements.[16] If that makes knowledge of objects indirect, we can face the problem of illusion with some success. We can at most say "we know objects indirectly"; then the possibility of error can be explained, along with a fallibilistic but rational account of knowledge. One may recall that Galileo's findings through the telescope were not an instant success. People regarded them as 'magic' in the beginning, because they were so different from naked-eye perception.

(4) Finally, anti-realists of various sorts may still say that all our so-called 'facts' are constructions, and therefore that the presupposition of the commonsense view that there is an objective external world is simplistic. But the persistent realist can argue[17] that we can nevertheless make a distinction between 'brute' facts and 'institutional' facts. Some facts are indeed institutional, depending upon culture, language, scientific tradition, paradigm, if you like. But that they are so constructed is contingent upon some brute fact. "Apple is sweet", "Orange is sour", "Jane is a married lady", "Tiger is a ferocious animal" or "The atomic number of gold is 79" are examples of constructions dependent upon the institution of language, natural classification, scientific knowledge of biology, etc. But that they are so described depends on something brute or irreducible in these things. Commonsense realism claims only this and no more.

[1] All the four senses are based on the entries under 'incommensurable' in the Concise Oxford Dictionary, 1992, Indian edition, Oxford University Press.
[2] E. Scheibe (a physicist-philosopher) argues that the physicist Weizsäcker holds such a view. See his "The Comparisons of Scientific Theories", Interdisciplinary Science Reviews, vol. 11, no. 2, 1986.
[3] L. Boltzmann first developed such a view in the last decade of the nineteenth century.
[4] Karl Popper regarded a refuted/superseded theory (such as Newton's) as an approximation of the superseding theory (such as Einstein's). Lakatos maintained a more liberal view on this point.
[5] These examples were first suggested to me by Professor E. Scheibe, formerly of Heidelberg University. He thinks physicists constantly try to reduce laws to other laws. The aim of science, in his view, is to interrelate different physical theories in a unified way. See his "Coherence and Contingency: Two Neglected Aspects of Theory Succession", Nous, 23, 1989.
[6] I have discussed such cases of 'epistemic loss' in theory change in "Scientific Rationality – A Rethinking", Journal of Indian Council of Philosophical Research (henceforth JICPR), vol. VII, no. 1, 1989; also in Angèle Kremer Marietti (ed.), Sociologie de la science, Bruxelles: Mardaga, 1998.
[7] John Watkins, "Commonsense Realism and Scientific Realism", paper presented at the International Colloquium "Rationality in Cognitive and Social Science", New Delhi, January 1992; published in BSPS, 1996.
[8] In "Scientific Methodology and the Causal Theory of Perception", in Lakatos and Musgrave (eds.), Problems in the Philosophy of Science, 1968.
[9] John Watkins does not think so. When I wrote this he disagreed with me (in private correspondence) that a theory of science has any ontological commitment. See also his article, "A Methodology without Presuppositions", in I. C. Jarvie and J. Hall (eds.), Essays in Honour of Ernest Gellner, CUP, 1992.
[10] A. Koestler, Janus, 1978.
[11] Capra, The Tao of Physics.
[12] These are well known and well accounted for in scientific literature of various kinds.
[13] B. Russell, An Outline of Philosophy.
[14] Capra, The Tao of Physics, p. 77.
[15] W. Heisenberg, Physics and Beyond, pp. 63-64.
[16] An information-processing model of knowledge can adequately explain it. Such a model is developed in M. Chaudhury, "Objective Knowledge and Psychologism", JICPR, 1995.
[17] For such arguments, see M. Chaudhury, "Objective Knowledge and Social Construction", in Andler, Chaudhury, et al. (eds.), Facets of Rationality, Sage, 1995.
