
Nuggets

Book of Nature
Computational Metaphor
Cosmic Natural Selection
Cosmological Constant
Cosmological Self-organization
Deductionism is ultimate reduction
Determinism in Physics
Digital Physics
Evolutionary Epistemology
Fine-tuning Argument
First-order Science
Idealism in Physics
Identity of Indiscernibles
Immanent Reality of Nature
Intentionality in Nature
Mathematical Platonism
Mechanist Philosophy
Multiverse
Natural Agency
Natural Idealism
Nomological Machine
Pancomputationalism
Pre-established Harmony
Principle of Sufficient Reason
Problem of Cognitive Domains
Second-order Science
Textualism
Time-reversibility

Time-reversibility

A system is physically closed when there is no material exchange with an environment. A system is logically closed when no process within it refers to anything outside its definitions. That would include any background against which to measure change. Such a system, however, is a deductive system, an idealization. It appears time-reversible because time has no meaning in a deductive system: every theorem is eternally latent within the axioms. Equations are reversible (by substituting –t for t), but real systems are not. No real system (except, by definition, the universe as a whole) can be considered logically closed or absolutely isolated; there is always an “outside” that constitutes a background against which processes can appear irreversible. Processes within systems considered isolated may appear reversible. In the real world, however, there is no such thing as time-reversibility.
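The sense in which "equations are reversible (by substituting –t for t)" can be made concrete with a standard textbook contrast, added here for illustration: Newton's second law keeps its form under the substitution, because the second time derivative absorbs the sign change twice, whereas the diffusion equation, first order in time, does not.

```latex
% Newton's second law for a conservative force: invariant under t -> -t.
m\,\frac{d^{2}x}{dt^{2}} = F(x)
\quad\xrightarrow{\;t \,\to\, -t\;}\quad
m\,\frac{d^{2}x}{d(-t)^{2}} = m\,\frac{d^{2}x}{dt^{2}} = F(x)

% The diffusion (heat) equation changes form under the same substitution:
% the model itself singles out a direction of time.
\frac{\partial u}{\partial t} = D\,\frac{\partial^{2}u}{\partial x^{2}}
\quad\xrightarrow{\;t \,\to\, -t\;}\quad
-\frac{\partial u}{\partial t} = D\,\frac{\partial^{2}u}{\partial x^{2}}
```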

No doubt the concept of a time-reversible system was inspired by special cases, such as celestial motions and simple cyclical mechanisms, where reversing the direction of motion does not affect the overall behavior of the system. Yet, even such reversibility leads to problems, such as the arrow of time or the prevalence in the real world of irreversible processes. (Not to mention the fact that actually reversing the direction of motion of a planet, for example, would be catastrophic!) The fundamental laws of physics are generally “time-reversal invariant,” even though many physical processes at the macroscopic scale seem irreversible, as expressed in the 2nd Law of Thermodynamics. However, strictly speaking, this reversibility is not a property of nature but a mathematical property of equations. One can calculate the behavior of the model backward or forward in time at will. But the model is an invention, not a real system. The problem is not to account for irreversible processes in a theoretically reversible world. Irreversible processes predominate because no part of the real world is actually a deductive system, as represented in such a model.
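The point that "one can calculate the behavior of the model backward or forward in time at will" can be sketched in a few lines of code; this is an illustration, not part of the original text. A frictionless oscillator integrated with the time-symmetric leapfrog scheme retraces its own trajectory when its velocity is reversed, a property of the deductive model rather than of any real pendulum.

```python
# Minimal sketch (illustrative): a frictionless harmonic oscillator
# integrated with the leapfrog (kick-drift-kick) scheme, which is
# time-reversible up to floating-point rounding. Flipping the velocity
# and integrating "forward" again returns the model to its initial state.

def leapfrog(x, v, steps, dt=0.01, k=1.0):
    """Integrate x'' = -k*x with the time-symmetric leapfrog scheme."""
    for _ in range(steps):
        v += -k * x * (dt / 2)   # half kick
        x += v * dt              # drift
        v += -k * x * (dt / 2)   # half kick
    return x, v

x0, v0 = 1.0, 0.0
x1, v1 = leapfrog(x0, v0, steps=10_000)   # run the model "forward"
x2, v2 = leapfrog(x1, -v1, steps=10_000)  # flip velocity: run it "backward"

print(abs(x2 - x0), abs(v2 + v0))  # both effectively zero: the model retraces itself
```

No real oscillator behaves this way: friction, thermal noise, and coupling to an environment (the "outside" mentioned above) make the actual process irreversible, while the model, being a closed deductive system, has no such background.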

Textualism

Textualism is the notion that nature itself is in some sense literally a text, a self-contained product of definition like a deductive system or a machine. This is a persistent and largely unacknowledged background assumption inherited by science from the medieval concept of the Book of Nature. Textualism is a thema, in Gerald Holton’s sense. The view of the universe as a result of divine decree, or as a machine running on without the ongoing support of divine will, represented an extension of medieval interest in textual exegesis. In that light, the Book of Nature could best be studied in syntactic terms, specifically with mathematics as the most efficient way to express and interpret the syntax of the world. The idea of the world as text is closely related to that of the world as a divine artifact—indeed, as a machine. Texts, artifacts, and machines are made by design. They possess only the reality defined by their creators and users. Like other artifacts, including machines, a text is a finite, self-enclosed product of definition, containing no more than what was explicitly inscribed by its author, along with the deductions implicit within that original specification. If nature is a text, then it should be as predictable as a machine and as searchable as other texts. It should be subject to the methods of textual interpretation that were applied to Scripture. Historical time was identified with biblical narrative. The ability to search the text was conflated with the ability to search actual time, making prophecy plausible. Events are predictable in a deterministic system for similar reasons: it is a deductive system, a product of definition, like a text. Whether expressed in Scripture or in the physicist’s equation, the text—while open to interpretation—is determinate and searchable, in a way that nature itself is not.

Second-order Science

If some questions are not decidable within the scientific framework as currently defined, this may be due in part to the exclusion of second-order considerations from “first-order” science. The scientific observer is part of a comprehensive system that includes the apparatus of measurement and the information-carrying medium, as well as the target system. But nowhere does first-order theory provide a way to understand the scientific process itself, let alone the relation between ‘mental’ and ‘physical’, for example. Everything remains implicitly external to the observer, whose role qua subject is not considered. There is generally no place for reflection on theory making, on methodology, or on the implicit assumptions and biases involved both in theory and in experiment. This exclusion may be responsible for a number of the conundrums of modern physics and cosmology—for example, the problem of the cosmological constant and the appearance of fine-tuning. It may be responsible for the general appearance of the quantum world as “weird.” Our expectations of reality are based on experience of the macroscopic realm, which means our interaction with the world as biological organisms. While traditional science studies the object by excluding the subject, second-order science would include the role of the subject as well. This role is necessarily that of an embodied organism. A second-order science would include the observer’s embodied circumstance as a product of evolution. It would be more complete, realistic, and objective than first-order science. It would include knowledge of how first-order theories are to be used. Second-order science would include in its view of natural reality how the activities of science themselves shape that view. Hence, it would view itself as an active player in a larger game, an integral part of the direction of society. For, theory making co-evolves with society itself, both influencing society and influenced by it.

Problem of Cognitive Domains

A domain is the set of elements upon which some operation is to be performed, such as a mathematical function or mapping. A cognitive domain is some defined level of information processing—a data set or domain of description. It may be operated upon and transformed into a new data set, defining a new cognitive domain. This may then serve as the input to some other level of further operations or processing. The problem of cognitive domains is the dilemma of logical circularity that arises when the output of a cognitive process is recycled as its input. This occurs, for example, when the physical world that appears in conscious experience is presupposed in order to explain the brain’s very construction of this appearance in consciousness. The output of mental processing is taken to be its input; a domain of description is recycled as its own rationale. Attempts to explain how the mind builds its picture of the external world typically begin with the very picture of the world they attempt to explain. The whole account of cognition leads up to and includes the physicist’s constructed version of reality, which is then assumed to be the starting point of the story! This is a general epistemic dilemma, which applies not only to perception but to every form of cognitive activity, including science. The scientific story must begin somewhere—with basic theoretical constructs it presumes to be real. These are products of the scientific process (for example, models) yet are recycled as realities in the causal story leading up to the existence of scientists and the scientific story. First-order science cannot come to terms with this shift in domains of description and the consequent circularity. Schopenhauer likened such a bootstrapping operation to the legendary Baron von Munchausen, who—in order to cross the river without drowning—lifted himself and his horse out of the water by pulling up on his own pate!
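The layering of domains described above can be sketched as a toy pipeline (hypothetical names, purely illustrative): each mapping takes one domain of description as input and yields a new one, and the circularity appears when a late-stage output is quietly treated as the first stage's input.

```python
# Toy sketch of cognitive domains as successive mappings (hypothetical
# names, purely illustrative). Each function maps one domain of
# description onto a new one, which then serves as input to the next level.

raw_signal = [0.1, 0.9, 0.4]           # "sensory" domain: uninterpreted data

def perceive(signal):                   # sensory domain -> perceptual domain
    return ["bright" if s > 0.5 else "dim" for s in signal]

def theorize(percepts):                 # perceptual domain -> theoretical domain
    return {"model": "photons striking a receptor", "evidence": percepts}

theory = theorize(perceive(raw_signal))

# The problem of cognitive domains: the circularity of taking `theory`,
# the output of the whole pipeline, as the account of what produced
# `raw_signal` in the first place.
```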

Principle of Sufficient Reason

The principle of sufficient reason, credited to Leibniz, suggests that there should be an answer to every reasonable question about why the universe is as it is. Everything should have a knowable cause. Yet this faith may be no more than wishful thinking, whose success as an evolutionary strategy in ordinary realms offers no guarantee that it applies universally, particularly in unfamiliar realms. In a weak sense, it may say no more than that human beings are capable of inventing a story to account for every observation. In a stronger sense, it asserts that there is a one-to-one correspondence between thought and reality—between map and territory—even between mathematical models and the real systems they model. In other words, it implies the pre-established harmony. This rigorous correspondence is not possible, however, unless the map is the territory or the territory is nothing but the map itself. For, the very nature of maps is that they are selective and symbolic representations, not literal duplications of the world. The principle of sufficient reason expresses the rationalist faith that the world should be knowable—and mathematically expressible—in every detail. It lies, for example, behind Einstein’s insistence on “completeness” for physical theories, such as the quantum theory. It underwrites the search for a definitive theory of everything. This faith is misplaced, however, if the world is actually real.

Pre-established Harmony

Leibniz took the apparent correspondence between the mental and physical realms to be an act of God, a divinely “pre-established harmony.” In particular, he may have had in mind the correspondence between mathematics and the physical sciences. Physicist Eugene Wigner would later call this harmony “the unreasonable effectiveness of mathematics.” We still marvel today at the effectiveness of mathematics in modeling nature. But perhaps the true marvel is the general ability of thought to model the external world at all. After all, there seems to be a correspondence, if imperfect, between sensory perception and reality. This suggests a harmony between world and brain, between ordinary cognition and natural reality. This does not mean that the world is as we see it, or that perception is an open window on the world, only that how we see it facilitates (or at least permits) our survival. Without a roughly reliable correspondence we simply would not be here. That correspondence was built in by evolution through natural selection, through long experience over many generations. The world is obviously suitable for life, since here we are. Very likely it is intelligible for similar reasons. Embodied participation in an evolutionary contest “pre-establishes” the relations between brain and world, including our very categories of thought. To the degree that logic reflects and abstracts cumulative actual experience, it is to be expected that logical truth should correspond to physical truth. Yet, science seeks out new realms of experience—or, rather, realms that cannot be directly experienced with the senses at all! The fundamental correspondence between mathematics and physical reality remains mysterious because it often holds in applications far removed from ordinary experience. However, a qualifying fact should be kept in mind: the systems investigated as “physical reality” are actually deductive systems—models. That is, physical reality has been re-defined in terms that lend themselves to mathematical expression. On that account, the pre-established harmony is less surprising.

Pancomputationalism

Pancomputationalism is an idealist notion that the whole world is a vast computation, with information as the essential substrate of nature. Faith that physical reality is ultimately comprehensible, and even computable, rests on the assumption that the universe is a computer, a computation, or at least a deductive system. Pancomputationalism simply views nature through the eyes of modern technology. For, the computer is the latest version of the mechanist metaphor, an icon of modern life and thought. A notion of the universe as an analog computer that somehow computes itself from one momentary state to the next is by itself trivial. It tells us no more than ordinary physics with differential equations, or the notion of causality as normally conceived. The operation of such a device (its “programming”) resides in its physical structure. The idea that the universe is a digital computer goes further. Its operation is a distinct affair from its structure. The digital computer must be programmed, and therefore programmed by someone. The program imposes on the machine a (temporary) virtual structure that is separate from its physical architecture and expressible in a binary code. Perhaps this idea finds acceptance because laws of physics have long been thought of as transcendent and separate from nature. Originally they were thought to be divine decrees.

Nomological Machine

The term nomological machine was coined by philosopher Nancy Cartwright. It is the conceptual equivalent of a literal machine. A mathematical model is a nomological machine. So is a controlled experiment—which may take the form of a literal machine. To be described mathematically at all, a natural system must first be redefined as such a conceptual system; otherwise, it would not involve clearly defined variables. Deductionism is the belief that all of physical reality can be mapped by such constructs, and is even isomorphic to literal machines such as quantum computers. However, despite the mechanist metaphor, nature is not literally a machine. A controlled experiment is an artificial version of a real situation in nature. Real systems cannot generally be controlled to isolate specific factors of interest, and the number of factors involved is indefinite. A computer simulation is also an artificial version of a natural reality—literally a nomological machine. In that case, deductionism is the belief that all of physical reality can be simulated by a digital computer, or perhaps by a quantum computer. Extended to the universe as a whole, and pushed to the extreme, it is the belief that the universe itself is a nomological machine.

Natural Idealism

Natural (or naïve) realism reflects the organism’s faith in the literal truth of cognition. Through projection, the brain’s real-time simulation of the external world is normally and transparently experienced as the world. We experience the world out there, not inside our skulls or on the sensory surfaces of the body. Natural idealism is a complementary phenomenon that does the opposite: it intuits that the true essence of experienced reality consists in ideas, thoughts, or perceptions rather than “material” things. While appearing to be contrary, natural idealism employs similar processes of abstraction, reification, and projection. It is as natural for us to credit ideas and abstractions with reality as to credit sense impressions. Just as natural realism is the basis for the realism of science, so natural idealism is the basis of philosophical idealism, such as that held by Plato or Berkeley. Both realism and idealism are assertions of what is “real.” (Which is why mathematical Platonism is sometimes called mathematical realism.)