Criticism of logical positivism
Quine and the “naturalized epistemology”
In his article “Two Dogmas of Empiricism” (1951), Willard Van Orman Quine criticizes two central tenets of logical positivism. The first is the distinction between analytic and synthetic truths: the claim that some propositions are true independently of the facts, by virtue of their meaning alone. The second dogma, reductionism, is the theory that every meaningful statement can be reformulated as a statement about the data of immediate experience (on this view, an analytic statement would be one confirmed by experience in every case).
This text constitutes a genuine attack on the theoretical inheritance of logical positivism. As Quine himself says, “Another effect is a shift toward pragmatism”: “Two Dogmas of Empiricism” marks the great return of pragmatism to American philosophy, from within the very movement that had ousted it from the intellectual scene: analytic philosophy (in its empiricist form).
With his “naturalized epistemology”, Quine asserts, from a naturalistic standpoint, that the philosophy of knowledge and of science is itself a scientific activity, corrected by the other sciences, and not a “first philosophy” founded on metaphysics.
Mach’s criticism of induction
A pioneer in the measurement of the speed of propagation of sound, Ernst Mach developed an epistemological thought that notably influenced Albert Einstein. In The Science of Mechanics: A Critical and Historical Account of Its Development, Mach unveils the mythological conception underlying the mechanistic representations of his time, a conception that leads to the conflict between spiritualists and materialists. But Mach’s criticism deals chiefly with the method of induction, as opposed to deduction. In Knowledge and Error (1905), Mach explains that the scientist’s work bears above all on the relations among the objects studied, not on their classification. The research process is above all mental, Mach concludes: “Before understanding nature, it is necessary to apprehend it in the imagination, in order to give the concepts a living intuitive content”. Moreover, Mach defends the idea that science is symbolic, a thesis also found in Karl Pearson’s The Grammar of Science (1892), which explains that science is “a conceptual shorthand”. Mach maintains that only the empirical method is scientific:
“We must limit our physical science to the expression of observable facts, without constructing hypotheses behind these facts, where nothing exists that can be conceived or proved.”
Bertrand Russell introduced into philosophy the notions of knowledge by acquaintance and knowledge by description, designating two basic types of knowledge.
The falsifiability of Karl Popper
The Austrian philosopher Karl Popper (1902–1994) overturned classical epistemology by proposing a new theory of knowledge, as early as 1934, with The Logic of Scientific Discovery. He gives epistemology new concepts and tools of examination, such as falsifiability (the capacity of a scientific theory to submit to a severe critical method) or non-falsifiability (which, a contrario, characterizes metaphysical, psychoanalytic, Marxist, or astrological theories). He proposes to see in falsifiability the criterion for distinguishing between science and non-science. A statement is thus “empirically informative if and only if it is testable or falsifiable, that is to say, if it is possible, at least in principle, for certain facts to contradict it”. Nevertheless, Popper admits that non-falsifiable statements can be heuristic and meaningful (as is the case in the human sciences).
Popper also offers a critique of the thesis of the unity of science, particularly in The Logic of Scientific Discovery. The idea of a completed system of knowledge is futile according to him: “we do not know, we can only conjecture”. The ideal of absolutely certain and demonstrable knowledge has, in his view, proved to be an idol. Finally, according to him, induction has no scientific value:
“There is no induction, because universal theories are not deducible from singular statements.”
The “scientific research programs” of Imre Lakatos
The thought of Imre Lakatos (1922–1974) is in line with that of Popper. He created the notion of the “scientific research programme”: a corpus of theoretical hypotheses linked to a research plan within a particular domain (a “paradigm”), such as Cartesian metaphysics. Although a student of Karl Popper, Lakatos opposes him on the point of falsifiability. According to him, a research programme is characterized by a positive heuristic (which defines what to look for and which methods to use) and a negative heuristic (which declares the programme’s core hypotheses inviolable).
The “normal science” of Thomas Kuhn
The work of Thomas Samuel Kuhn marked a fundamental break in the philosophy, history, and sociology of science. It historicized science and rejected a fixist conception of it. His main work in this area, The Structure of Scientific Revolutions (1962), states that “it is thus difficult to consider scientific development as a process of accumulation, because it is difficult to isolate individual discoveries and inventions”.
“When, that is, the profession can no longer evade anomalies that subvert the existing tradition of scientific practice — then begin the extraordinary investigations that lead the profession at last to a new set of commitments, a new basis for the practice of science”, he adds, describing these bases of practice as scientific paradigms (light considered as a corpuscle, then as a wave, then as a particle). These “extraordinary episodes” are the “scientific revolutions” (such as those brought about by Nicolaus Copernicus, Isaac Newton, Lavoisier, or Einstein): each overthrows a dominant paradigm. The state of a science, of its knowledge and its paradigm at a given moment, is what Kuhn calls “normal science”, which
“means research firmly based upon one or more past scientific achievements, achievements that some particular scientific community acknowledges for a time as supplying the foundation for its further practice.”
Opposed to any materialist and realist interpretation of chemistry and physics, Pierre Duhem proposed a conception of science that would later be described as “instrumentalist” in The Aim and Structure of Physical Theory (1906). According to instrumentalism, science does not describe reality beyond phenomena but is merely a convenient instrument of prediction.
Quine’s epistemological holism is not limited to physics, like Duhem’s, nor even to the experimental sciences, like Carnap’s, but extends to the whole of science, logic and mathematics included.
The phenomenology of Husserl
For Edmund Husserl, phenomenology takes as its point of departure experience as the sensible intuition of phenomena, in order to extract the essential structures of lived experiences as well as the essence of what one experiences.
Systemic and epistemological constructivism
The term constructivism appeared at the beginning of the 20th century with the Dutch mathematician Brouwer, who used it to characterize his position on the question of the foundations of mathematics as a master discipline. But it is above all Jean Piaget who gave constructivism its letters of nobility: with the publication in 1967 of the Pléiade encyclopedia volume, and in particular the article Logic and Scientific Knowledge, he brought about, according to Jean-Louis Le Moigne, a “rebirth of epistemological constructivism”, drawing especially on the work of Bachelard. However, according to Ian Hacking, it was Kant who was “the great pioneer of construction”.
The constructivist school accepts as true only what the scientist can construct from ideas and hypotheses that intuition (as the foundation of mathematics) accepts as true, and which are representable. The psychologist and epistemologist Jean Piaget explains that the “fact is (…) always the product of a composition between a part provided by the objects and another constructed by the subject”. Experimentation then serves only to verify the internal coherence of the construction (this is the notion of the epistemological model). Piaget, however, extends the constructivist framework into what he calls “genetic epistemology”, which studies the conditions of knowledge and the laws of its growth in connection with the neurological development of intelligence. For him, epistemology includes both the theory of knowledge and the philosophy of science (what he calls the “circle of the sciences”: each science strengthens the edifice of the others). In other words, “the succession of the sciences in history obeys the same logic as the ontogeny of knowledge”. Without claiming a total resemblance, the mechanisms are common from the individual to the group of researchers, and hence to the scientific disciplines (Piaget cites “reflective abstraction”).
Rejecting empiricism, constructivist epistemology posits that knowledge is constructed through a dialectic, from subject to object and from object to subject, in an experimental back-and-forth.
Jean Piaget proposed to define epistemology “as a first approximation, as the study of the constitution of valid knowledge”, a definition which, according to Jean-Louis Le Moigne, allows the three major questions of the discipline to be posed:
- What is knowledge, and what is its mode of investigation (the “gnoseological” question)?
- How is knowledge created or generated (the methodological question)?
- How is its value or validity to be appreciated (the question of its scientificity)?
This work inspired several authors. Some, connected with the systemic approach, were brought together by Paul Watzlawick in 1980 in The Invented Reality: Contributions to Constructivism. Edgar Morin offered constructivism its “discourse on method” with The Method. Herbert Simon renewed the classification of the sciences with The Sciences of the Artificial.
Structuralism is a set of holistic currents in epistemology that appeared mainly in the human and social sciences in the middle of the twentieth century. These currents share the use of the term “structure” as a theoretical model (unconscious, or not empirically perceptible) organizing the form of the object studied, taken as a system; the emphasis is placed less on the elementary units of this system than on the relations that unite them. The explicit reference to the term “structure”, whose definition is not unified across the currents of thought concerned, became progressively systematized with the institutional construction of the human and social sciences from the second half of the nineteenth century, in a positivist filiation; some authors, however, trace the genealogy of structuralism much further back (to Aristotle).
The definition of structuralism and of its disciplinary boundaries has become a field of research in its own right, complex and rapidly evolving. Today, the term tends to designate two types of phenomena:
- in the best-known sense (generalized structuralism), a particular period in the history of scientific ideas: a transitory intellectual fashion of a protest nature, running from the end of the 1950s to the beginning of the 1970s, which extended well beyond academic boundaries to invade the literary, media, and political fields. This “structuralist moment”, inspired essentially by Saussurean linguistics and strongly marked by its formalism, was organized around a small number of leading figures: Roland Barthes in literature, Jacques Lacan in psychoanalysis, Michel Foucault and Louis Althusser in philosophy (in France);
- in its more specialized epistemological sense, a scientific paradigm close to the systemic one, in which the notion of structure is centered on the dynamic genesis of systems of mind and meaning, understood in the sense of the philosophy of form, with a pedigree going back to Aristotle. It is in this naturalistic lineage of structuralism that the ethnologist Claude Lévi-Strauss placed himself, developing from the 1950s a structural anthropology that broke with the Anglo-Saxon anthropological currents of the time (evolutionism, diffusionism, culturalism, functionalism).
In this current of thought, the object studied is considered as a complex system: it depends on a multitude of parameters and exhibits inertias, non-linearities, feedback loops, recursions, thresholds, operating regimes, mutual influences among variables, delay effects, hysteresis, emergence, self-organization, and so on. It is related to its environment, which feeds it with inputs (e.g., energy and controls) and to which it delivers outputs (e.g., production and waste).
Henri Poincaré was a precursor of this approach; Edgar Morin and Jean-Louis Le Moigne developed it in their works, writings, and lectures.
- enterprise information system;
- decision-making system;
- operating industrial system.
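A few of the traits attributed above to complex systems (inputs and outputs exchanged with an environment, a feedback loop, a saturation threshold) can be illustrated with a minimal discrete-time sketch. The function names, the feedback gain, and the threshold value below are hypothetical choices made purely for illustration, not part of any author's model:

```python
# Minimal sketch (illustrative only) of a system coupled to its environment:
# it receives inputs, returns outputs, and its state feeds back on itself
# non-linearly through a saturation threshold.

def step(state, inp, feedback_gain=0.5, threshold=10.0):
    """Advance the system one step: the new state depends on the input
    and, via feedback, on the previous state; it saturates at `threshold`."""
    new_state = state + inp + feedback_gain * state
    return min(new_state, threshold)  # threshold: the state cannot grow past it

def run(inputs, state=0.0):
    """Feed a sequence of inputs to the system, collecting its outputs."""
    outputs = []
    for inp in inputs:
        state = step(state, inp)
        outputs.append(state)  # output delivered back to the environment
    return outputs

print(run([1.0] * 6))  # → [1.0, 2.5, 4.75, 8.125, 10.0, 10.0]
```

Even this toy system shows the qualitative behavior the text describes: growth driven by feedback until a threshold regime takes over, so the whole trajectory cannot be read off from any single input in isolation.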
| Old principle (Cartesian) | New principle (complex) | Observations |
| --- | --- | --- |
| Evidence | Relevance | Utility: allows subjectivity, which matters for the purposes pursued |
| Reductionism | Globalism | Part of a larger whole: do not cut or divide, because information is lost |
| Causalism | Teleology | Understand the behavior of the object in relation to the purposes assigned to the system by the modeler |
| Completeness | Aggregativity | Grouping by means of “proven recipes” that simplify while keeping the relevant aspects |
Analytical modeling methods must adapt in order to become systemic modeling methods, with a different vocabulary, different concepts, tools, and thought processes:
| Old “tool” (Cartesian) | New “tool” (complex) |
| --- | --- |
| Object | Project or process |
| Element | Active unit |
| Set | System |
| Analysis | Design |
| Disjunction | Conjunction |
| Structure | Organization |
| Optimization | Adequacy |
| Control | Intelligence |
| Efficiency | Effectiveness |
| Application | Projection |
| Evidence | Relevance |
| Causal explanation | Teleological understanding |
| Plan, program | Strategy with checkpoints and possible reorientation |