The later positivism of logical empiricism
Logical positivism, essentially the doctrine of the Vienna Circle, underwent a number of important changes and innovations in the middle third of the century, which suggested the need for a new name. The designation positivism had been strongly connected with the Comte-Mach tradition of instrumentalism and phenomenalism. The emphasis that this tradition had placed on the positive facts of observation, however, and its negative attitude toward the atomic theory and the existence of theoretical entities in general were no longer in keeping with the spirit of modern science. Nevertheless, the requirement that hypotheses and theories be empirically testable, though it became more flexible and tolerant, could not be relinquished. It was natural, then, that the word empiricism should occur in any new name. Accordingly, the term logical was retained in roughly its earlier meaning, and the new name “logical empiricism” was coined.
The status of the formal and a priori
The intention of the word logical was to insist on the distinctive nature of logical and mathematical truth. In opposition to Mill’s view, according to which even logic and pure mathematics are empirical (i.e., are justifiable or refutable by observation), the logical positivists—essentially following Frege and Russell—had already declared mathematics to be true only by virtue of postulates and definitions. Expressed in the traditional terms used by Kant, logic and mathematics were recognized by the logical positivists as “a priori” disciplines (valid independently of experience) precisely because the denial of any of their truths would amount to a self-contradiction; their statements take the form of “analytic propositions”—i.e., propositions that are true or false solely by virtue of the meanings of the terms they contain. In his own way, Leibniz had adopted the same view in the 17th century, long before Kant. The truth of such a simple arithmetical proposition as, for example, “2 + 3 = 5” is necessary, universal, a priori, and analytic because of the very meanings of “2,” “+,” “3,” “5,” and “=.” Experience could not possibly refute such truths because their validity is established (as Hume said) merely by the “relations of ideas.” Even if—“miraculously”—putting two and three objects together should on some occasion yield six objects, this would be a fascinating feature of those objects, but it would not in the least tend to refute the purely definitional truths of arithmetic.
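The definitional character of such truths can be made explicit. The following derivation (a modern reconstruction in the Frege-Russell spirit, not a text of the positivists themselves) unfolds “2 + 3 = 5” from the standard recursive definition of addition over the successor function S:

```latex
% Numerals defined as iterated successors:
%   1 = S(0), 2 = S(1), 3 = S(2), 4 = S(3), 5 = S(4).
% Addition defined recursively:  a + 0 = a  and  a + S(b) = S(a + b).
\begin{align*}
2 + 3 &= 2 + S(2) = S(2 + 2) \\
      &= S\bigl(2 + S(1)\bigr) = S\bigl(S(2 + 1)\bigr) \\
      &= S\bigl(S(2 + S(0))\bigr) = S\bigl(S(S(2 + 0))\bigr) \\
      &= S\bigl(S(S(2))\bigr) = S\bigl(S(3)\bigr) = S(4) = 5.
\end{align*}
```

Every step invokes only the definitions; no observation of objects enters at any point.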
The case of geometry is altogether different. Geometry can be either an empirical science of natural space or an abstract system with uninterpreted basic concepts and uninterpreted postulates. The latter is the conception introduced in rigorous fashion by David Hilbert and later by an American geometer, Oswald Veblen. In the axiomatizations that they developed, the basic concepts, called primitives, are implicitly defined by the postulates: thus, such concepts as point, straight line, intersection, betweenness, and plane are related to each other in a merely formal manner. The proof of theorems from the postulates, together with explicit definitions of derived concepts (such as triangle, polygon, circle, or conic section), proceeds by strict deductive inference. Very different, however, is geometry as understood in practical life and in the natural sciences and technologies, in which it constitutes the science of space. Ever since the development of the non-Euclidean geometries in the first half of the 19th century, it has no longer been taken for granted that Euclidean geometry is uniquely applicable to the spatial order of physical objects or events. In Einstein’s general theory of relativity and gravitation, in fact, a four-dimensional Riemannian geometry with variable curvature was successfully employed, an event that amounted to a final refutation of the Kantian contention that the truths of geometry are “synthetic a priori.” With respect to the relation of postulates to theorems, geometry is thus analytic, like any other rigorously deductive discipline. The postulates themselves, when interpreted—i.e., when construed as statements about the structure of physical space—are indeed synthetic but also a posteriori; i.e., their adequacy depends upon the results of observation and measurement.
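How postulates can implicitly define their primitives may be illustrated by a small fragment paraphrasing Hilbert’s axioms of incidence (the wording here is a paraphrase, not Hilbert’s own):

```latex
% Primitives: "point," "straight line," and the incidence relation "lies on."
\begin{enumerate}
  \item For any two distinct points $A$ and $B$, there exists a straight line
        $\ell$ on which both $A$ and $B$ lie.
  \item For any two distinct points $A$ and $B$, there is at most one such
        straight line $\ell$.
  \item On every straight line there lie at least two points; there exist at
        least three points that do not all lie on one straight line.
\end{enumerate}
```

Nothing in such postulates says what points or straight lines are; any objects whatever that stand in the stated relations count as a model of the system.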
Developments in linguistic analysis and their offshoots
Important contributions, beginning in the early 1930s, were made by Carnap, by the Austrian-American mathematical logician Kurt Gödel, and by others to the logical analysis of language. Charles Morris, a pragmatist concerned with linguistic analysis, had outlined the three dimensions of semiotics (the general study of signs and symbolisms): syntax, semantics, and pragmatics (the relation of signs to their users and to the conditions of their use). Syntactical studies, concerned with the formation and transformation rules of language (i.e., its purely structural features), soon required supplementation by semantical studies, concerned with rules of designation and of truth. Semantics, in the strictly formalized sense, owed its origin to Alfred Tarski, a leading member of the Polish school of logicians, and was then developed by Carnap and applied to problems of meaning and necessity. As Wittgenstein had already shown, the necessary truth of tautologies simply amounts to their being true under all conceivable circumstances. Thus, the so-called eternal verity of the principles of identity (p is equivalent to itself), of noncontradiction (one cannot both assert and deny the same proposition), and of excluded middle (any given proposition is either true or false; there is no further possibility) is an obvious consequence of the rules according to which the philosopher uses (or decides to use) the words proposition, negation, equivalence, conjunction, disjunction, and others. Quite generally, questions regarding the meanings of words or symbols are answered most illuminatingly by stating the syntactical and the semantical rules according to which they are used.
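Wittgenstein’s criterion can be stated mechanically: a tautology is a formula that comes out true under every assignment of truth values to its atomic propositions. The following sketch (a hypothetical illustration in Python; nothing of the kind appears in the literature discussed here) checks the three principles just mentioned:

```python
from itertools import product

def is_tautology(formula, num_vars):
    """Return True if the formula (a Boolean function) is true under
    every assignment of truth values -- Wittgenstein's criterion."""
    return all(formula(*values)
               for values in product([True, False], repeat=num_vars))

# Identity: p is equivalent to itself.
print(is_tautology(lambda p: p == p, 1))              # True
# Noncontradiction: not (p and not-p).
print(is_tautology(lambda p: not (p and not p), 1))   # True
# Excluded middle: p or not-p.
print(is_tautology(lambda p: p or not p, 1))          # True
# A contingent formula, by contrast, fails the test.
print(is_tautology(lambda p, q: p or q, 2))           # False
```

That the first three checks succeed reflects nothing about the world; it records only the rules governing the logical connectives.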
Two different schools of thought originated from this basic insight: (1) the philosophy of “ordinary language” analysis—inspired by Wittgenstein, especially in his later work, and (following him) developed in differing directions by Ryle, J.L. Austin, John Wisdom, and others, and (2) the technique, essentially that of Carnap, usually designated as logical reconstruction, which builds up an artificial language. In the procedures of ordinary-language analysis, an attempt is made to trace the ways in which people commonly express themselves. In this manner, many of the traditional vexatious philosophical puzzles and perplexities are shown to arise out of theoretically driven misuses or distortions of language. (Lewis Carroll had already anticipated some of these oddities in his whimsical manner in Alice’s Adventures in Wonderland [1865] and Through the Looking-Glass [1871].) The much more rigorous procedures of the second school—of Tarski, Carnap, and many other logicians—rest upon the obvious distinction between the language (and all of its various symbols) that is the object of analysis, called the object language, and that in which the analysis is formulated, called the metalanguage. If needed and fruitful, the analysis can be repeated—in that the erstwhile metalanguage can become the object of a metametalanguage and so on—without the danger of a vicious infinite regress.
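The object-language/metalanguage distinction can likewise be illustrated in miniature (again a hypothetical sketch: the object language is a toy propositional notation encoded as nested tuples, and Python serves as the metalanguage in which the rules of truth are stated):

```python
# Object language: a toy propositional notation, encoded as nested tuples.
# Metalanguage: Python, in which the rules of designation and truth are stated.

def is_true(formula, valuation):
    """A Tarski-style truth definition: the truth of an object-language
    formula is defined recursively in the metalanguage."""
    if isinstance(formula, str):                  # an atomic sentence
        return valuation[formula]
    connective, *parts = formula
    if connective == "not":
        return not is_true(parts[0], valuation)
    if connective == "and":
        return is_true(parts[0], valuation) and is_true(parts[1], valuation)
    if connective == "or":
        return is_true(parts[0], valuation) or is_true(parts[1], valuation)
    raise ValueError("unknown connective: " + connective)

# "p and not q" is a sentence OF the object language; the claim that it is
# true under the given valuation is a sentence ABOUT it, in the metalanguage.
print(is_true(("and", "p", ("not", "q")), {"p": True, "q": False}))  # True
```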
With the help of semantic concepts, an old perplexity in the theory of knowledge can then be resolved. Positivists have often tended to conflate the truth conditions of a statement with its confirming evidence, a conflation that led phenomenalists and operationalists into certain absurdities, such as the pronouncement that the meanings of statements about past events consist in their future confirming evidence. Clearly, the objects—the targets or referents—of such statements are the past events themselves. Thus, the meaning of a historical statement is given by its truth conditions—i.e., the situation that would have to obtain for the statement to be true. The confirmatory evidence, however, may be discovered either in the present or in the future. Similarly, the evidence for an existential hypothesis in the sciences may consist, for example, in cloud-chamber tracks, spectral lines, or the like, whereas the truth conditions may relate to subatomic processes or to astrophysical facts. Or, to take an example from psychoanalysis, the occurrences of unconscious wishes or conflicts are the truth conditions for which the observable symptoms (Freudian slips, manifest dream contents, and the like) serve merely as indicators or clues—i.e., as items of confirming evidence.
The third dimension of language (in Morris’s scheme of semiotics)—i.e., the pragmatic aspect—was intensively investigated by Austin and his students, notably John Searle, and extensively developed from the 1960s by philosophers and linguists, including Searle, H.P. Grice, Robert Stalnaker, David Kaplan, Kent Bach, Stephen Levinson, and Dan Sperber and Deirdre Wilson. (See also language, philosophy of: Ordinary language philosophy, and Practical and expressive language.)
One of the most surprising and revolutionary offshoots of the metalinguistic (formal) analyses was Gödel’s discovery, in 1931, of an exact proof of the existence of formally undecidable propositions in a wide class of mathematical systems, a discovery that dealt a severe blow to the expectations of the formalistic school of mathematics championed by Hilbert and his collaborator, Paul Bernays. Before Gödel’s discovery, it had seemed plausible that a mathematical system could be complete in the sense that any well-formed formula of the system could be either proved or disproved on the basis of the given set of postulates. But Gödel showed rigorously (what had been only a conjecture on the part of the Dutch intuitionist L.E.J. Brouwer and his followers) that, for a large class of important mathematical systems, such completeness cannot be achieved.
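In a modern formulation (later than Gödel’s own wording, and incorporating J.B. Rosser’s 1936 strengthening from ω-consistency to consistency), the result reads:

```latex
\textbf{Theorem.} Let $T$ be a consistent, effectively axiomatizable formal
theory containing elementary arithmetic. Then there is a sentence $G_T$ in
the language of $T$ such that
\[
  T \nvdash G_T
  \qquad\text{and}\qquad
  T \nvdash \neg G_T .
\]
```

Hence no such theory is complete in the sense defined above.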
Both Carnap and Reichenbach, in their very different ways, made extensive contributions to the theory of probability and induction. Impressed by the need for a thoroughly empirical interpretation of the concept of probability, Reichenbach elaborated a view that conceived probability as the limit of a relative frequency and buttressed it with a pragmatic justification of inductive inference. Carnap granted the importance of this concept (especially in modern physical theories) but attempted, in increasingly refined and often revised forms, to define a purely logical concept of degree of confirmation. Statements ascribing an inductive probability to a hypothesis are, in Carnap’s view, analytic, because they merely formulate the strength of the support bestowed upon a hypothesis by a given body of evidence.
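The two concepts can be indicated schematically (a compressed rendering, not the authors’ own formulas). For Reichenbach, if $n_A$ is the number of occurrences of an attribute $A$ among the first $n$ members of a sequence of trials, then

```latex
% Probability as the limit of relative frequency (Reichenbach).
\[
  P(A) \;=\; \lim_{n \to \infty} \frac{n_A}{n}.
\]
```

Carnap’s inductive logic, by contrast, assigns to a hypothesis h and evidence e a degree of confirmation, written c(h, e), whose value is fixed on purely logical grounds, so that statements of the form “c(h, e) = r” are, as noted above, analytic.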