Quantum Informational Biochemistry. Thought of the Day 71.0


A natural extension of the information-theoretic Darwinian approach to biological systems is obtained by taking into account that biological systems are, at their most fundamental level, constituted by physical systems. It is therefore through the interaction of elementary physical systems that the biological level is reached, once the size of the system has grown by several orders of magnitude and only for certain associations of molecules – biochemistry.

In particular, this viewpoint lies at the foundation of the “quantum brain” project established by Hameroff and Penrose (Shadows of the Mind). They tried to lift quantum physical processes associated with the microsystems composing the brain to the level of consciousness, with microtubules considered as the basic quantum information processors. This project, as well as the general project of reducing biology to quantum physics, has its strong and weak sides. One of the main problems is that decoherence should quickly wash out quantum features such as superposition and entanglement. (Hameroff and Penrose would disagree with this statement; they try to develop models of the hot and macroscopic brain that preserve quantum features of its elementary micro-components.)

However, even if we assume that microscopic quantum physical behavior disappears with increasing size and number of atoms due to decoherence, it seems that the basic quantum features of information processing can survive in macroscopic biological systems (operating on temporal and spatial scales which are essentially different from the scales of the quantum micro-world). The associated information processor for a mesoscopic or macroscopic biological system would be a network of increasing complexity formed by the elementary probabilistic classical Turing machines of its constituents. Such a composite network of processors can exhibit special behavioral signatures which are similar to quantum ones. We call such biological systems quantum-like. In a series of works, Asano and others (Quantum Adaptivity in Biology: From Genetics to Cognition) developed an advanced formalism for modeling the behavior of quantum-like systems based on the theory of open quantum systems and the more general theory of adaptive quantum systems. This formalism is known as quantum bioinformatics.
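
As a pointer to the mathematics this formalism draws on (a generic sketch of open-quantum-systems dynamics, not the specific adaptive scheme of Asano and co-authors), the evolution of the state ρ of a system coupled to an environment is standardly written as a Gorini–Kossakowski–Sudarshan–Lindblad master equation:

$$\frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho] + \sum_k \gamma_k \Big( L_k \rho L_k^{\dagger} - \tfrac{1}{2}\{ L_k^{\dagger} L_k, \rho \} \Big),$$

where H is the system Hamiltonian and the operators L_k, with rates γ_k ≥ 0, encode the influence of the environment – in the quantum-like setting, the informational context in which the biological system operates.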

The present quantum-like model of biological behavior is of the operational type (as is the standard quantum mechanical model endowed with the Copenhagen interpretation). It cannot explain the physical and biological processes behind quantum-like information processing. Clarification of the origin of quantum-like biological behavior is related, in particular, to understanding the nature of entanglement and its role in the processes of interaction and cooperation in physical and biological systems. Qualitatively, the information-theoretic Darwinian approach offers an interesting possibility for explaining the generation of quantum-like information processors in biological systems; hence, it can serve as the bio-physical background for quantum bioinformatics. It is an intriguing point that, if the information-theoretic Darwinian approach is right, it would be possible to produce quantum information from optimal flows of past, present and anticipated classical information in any classical information processor endowed with a sufficiently complex program. Thus the unified evolutionary theory would supply a physical basis for Quantum Information Biology.

Whitehead’s Anti-Substantivalism, or Process & Reality as a Cosmology-to-be. Thought of the Day 39.0


Treating “stuff” as some kind of metaphysical primitive is mere substantivalism – and fundamentally question-begging. One has replaced an extra-theoretic referent of the wave-function (unless one defers to some quasi-literalist reading of the nature of the stochastic amplitude function ζ[X(t)] as somehow characterizing something akin to a “density of stuff”, while moreover the logic and probability (Born rules) must ultimately be obtained from experimentally determined scattering amplitudes) with something at least as mystifying, as the argument against decoherence goes on to show:

In other words, you have a state vector which gives rise to an outcome of a measurement and you cannot understand why this is so according to your theory.

As a response to Platonism, one can likewise read Process and Reality as essentially anti-substantivalist.

Consider, for instance:

Those elements of our experience which stand out clearly and distinctly [giving rise to our substantial intuitions] in our consciousness are not its basic facts, [but] they are . . . late derivatives in the concrescence of an experiencing subject. . . . Neglect of this law [implies that] . . . [e]xperience has been explained in a thoroughly topsy-turvy fashion, the wrong end first (161).

To function as an object is to be a determinant of the definiteness of an actual occurrence [occasion] (243).

The phenomenological ontology offered in Process and Reality is richly nuanced (including metaphysical primitives such as prehensions, occasions, and their respectively derivative notions such as causal efficacy, presentational immediacy, nexus, etc.). None of these suggests a metaphysical notion of substance (i.e., independently existing subjects) as a primitive. The case can perhaps be made concerning the discussion of eternal objects, but such notions as discussed vis-à-vis the process of concrescence are obviously not metaphysically primitive notions. Certainly these metaphysical primitives conform, in a more nuanced and articulated manner, to aspects of process ontology. “Embedding” – as the notion of emergence – is a crucial constituent in the information-theoretic, quantum-topological, and geometric accounts. Moreover, concerning the issue of relativistic covariance, it is to be regarded that Process and Reality is really a sketch of a cosmology-to-be . . . [in the spirit of] Kant [who] built on the obsolete ideas of space, time, and matter of Euclid and Newton. Whitehead set out to suggest what a philosophical cosmology might be that builds on Newton.

Quantum Gravity as a Fecund Ground for the Metaphysician. Note Quote.


In string theory, as well as in Loop Quantum Gravity and in other approaches to quantum gravity, indications are coalescing that not only time but also space is no longer a fundamental entity, but merely an “emergent” phenomenon that arises from the basic physics. In the language of physics, spacetime theories such as GTR are “effective” theories and spacetime itself is “emergent”. However, unlike the notion that temperature is emergent, the idea that the universe is not in space and time arguably shocks our very idea of physical existence as profoundly as any scientific revolution ever did. It is not even clear whether we can coherently formulate a physical theory in the absence of space and time. Space disappears in LQG insofar as the physical structures it describes bear little, if any, resemblance to the spatial geometries found in GTR. These structures are discrete, not continuous as classical spacetimes are. They represent the fundamental constitution of our universe and correspond, somehow, to chunks of physical space, thus giving rise – in a way yet to be elucidated – to the spatial geometries we find in GTR. The conceptual problem of coming to grasp how to do physics in the absence of an underlying spatio-temporal stage on which the physics can play out is closely tied to the technical difficulty of mathematically relating LQG back to GTR. Physicists have yet to fully understand how classical spacetimes emerge from the fundamental non-spatio-temporal structure of LQG, and philosophers are only just starting to study its conceptual foundations and the implications of quantum gravity in general and of the disappearance of space-time in particular. Even though the mathematical heavy lifting will fall to the physicists, there is a role for philosophers here in exploring and mapping the landscape of conceptual possibilities, bringing to bear the immense philosophical literature on emergence and reduction, which offers a variegated conceptual toolbox.
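
A standard illustration of this discreteness (quoted here only as an indicative formula, not as a step in the argument above) is the spectrum of the area operator in LQG: a surface punctured by the links of a spin network carrying spins j_i has area

$$A = 8\pi \gamma\, \ell_P^2 \sum_i \sqrt{j_i (j_i + 1)}, \qquad \ell_P = \sqrt{\hbar G / c^3},$$

where γ is the Barbero–Immirzi parameter, so geometric quantities come in discrete Planck-scale quanta rather than varying continuously as in GTR.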

Understanding how classical spacetime re-emerges from the fundamental quantum structure involves what physicists call “taking the classical limit.” In a sense, relating the spin network states of LQG back to the spacetimes of GTR is a reversal of the quantization procedure employed to formulate the quantum theory in the first place. Thus, while the quantization can be thought of as the “context of discovery,” finding the classical limit that relates the quantum theory of gravity to GTR should be considered the “context of (partial) justification.” It should be emphasized that understanding how (classical) spacetime re-emerges by retrieving GTR as a low-energy limit of a more fundamental theory is not only important to “save the appearances” and to accommodate common sense – although it matters in these respects as well – but must also be considered a methodologically central part of the enterprise of quantum gravity. If it cannot be shown that GTR is indeed related to LQG in some mathematically well-understood way as the approximately correct theory when energies are sufficiently low or, equivalently, when scales are sufficiently large, then LQG cannot explain why GTR has been as empirically successful as it has been. A successful theory can only be legitimately supplanted if the successor theory not only makes novel predictions or offers deeper explanations, but is also able to replicate the empirical success of the theory it seeks to replace.
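
What “sufficiently low energies” or “sufficiently large scales” means can be indicated schematically (as a heuristic expectation rather than an established result of LQG): deviations from GTR should be controlled by the ratio of the Planck length to the characteristic length scale L of the regime under consideration,

$$\text{quantum-gravitational corrections} \sim \left( \frac{\ell_P}{L} \right)^{n}, \qquad n > 0,$$

so that for laboratory, astrophysical and cosmological scales, where L ≫ ℓ_P ≈ 1.6 × 10⁻³⁵ m, such corrections are negligible and the empirical success of GTR is accounted for.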

Ultimately, of course, the full analysis will depend on the full articulation of the theory. But focusing on the kinematical level, and thus avoiding having to fully deal with the problem of time, let us apply the concepts to the problem of the emergence of full spacetime, rather than just time. Butterfield and Isham identify three types of reductive relations between theories: definitional extension, supervenience, and emergence, of which only the last has any chance of working in the case at hand. For Butterfield and Isham, a theory T1 emerges from another theory T2 just in case there exists either a limiting or an approximating procedure to relate the two theories (or a combination of the two). A limiting procedure takes the mathematical limit of some physically relevant parameters of the underlying theory, in general in a particular order, in order to arrive at the emergent theory. A limiting procedure won’t work, at least not by itself, due to technical problems concerning the maximal loop density as well as to what essentially amounts to the measurement problem familiar from non-relativistic quantum physics.
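
For orientation, the textbook example of a limiting procedure (used here purely to illustrate Butterfield and Isham’s notion, not as a claim about LQG) is the recovery of Newtonian behaviour from quantum mechanics. By the Ehrenfest theorem,

$$\frac{d}{dt}\langle \hat{x} \rangle = \frac{\langle \hat{p} \rangle}{m}, \qquad \frac{d}{dt}\langle \hat{p} \rangle = -\,\langle V'(\hat{x}) \rangle,$$

and for wave packets sharply peaked around their mean values – the regime reached when ℏ is small compared with the relevant classical actions – one has ⟨V′(x̂)⟩ ≈ V′(⟨x̂⟩), so the expectation values obey Newton’s equation of motion. The paragraph above notes why such a pure limit is not, by itself, available in the gravitational case.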

An approximating procedure designates the process of either neglecting some physical magnitudes, and justifying such neglect, or selecting a proper subset of states in the state space of the approximating theory, and justifying such selection, or both, in order to arrive at a theory whose values of physical quantities remain sufficiently close to those of the theory to be approximated. Note that the “approximandum,” the theory to be approximated, in our case will not be GTR, but only its vacuum sector of spacetimes of topology Σ × R. One of the central questions will be how the selection of states is to be justified. Such a justification would be had if we could identify a mechanism that “drives the system” to the right kind of states. Any attempt to find such a mechanism will foist upon us a host of issues known from the traditional problem of relating quantum to classical mechanics. A candidate mechanism, here as elsewhere, is some form of “decoherence,” even though that standardly involves an “environment” with which the system at stake can interact. But the system of interest in our case is, of course, the universe, which makes it hard to see how there could be any outside environment with which the system could interact. The challenge then is to conceptualize decoherence in a way that circumvents this problem.
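
To indicate what decoherence amounts to formally (a generic textbook sketch, not a worked-out proposal for the cosmological case), let a system S be coupled to an environment E; the state accessible to observations on S alone is the reduced density matrix, and its interference terms between quasi-classical alternatives are suppressed on a decoherence timescale τ_D:

$$\rho_S(t) = \mathrm{Tr}_E\, \rho_{SE}(t), \qquad \big[\rho_S(t)\big]_{ij} \approx \big[\rho_S(0)\big]_{ij}\, e^{-t/\tau_D} \quad (i \neq j).$$

The difficulty flagged above is precisely that for the universe as a whole there is no external E to trace over, so any cosmological use of this mechanism has to reinterpret what plays the role of the environment, for instance unobserved internal degrees of freedom.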

Once it is understood how classical space and time disappear in canonical quantum gravity and how they might be seen to re-emerge from the fundamental, non-spatiotemporal structure, the way in which classicality emerges from the quantum theory of gravity does not radically differ from the way it is believed to arise in ordinary quantum mechanics. The project of pursuing such an understanding is of relevance and interest for at least two reasons. First, important foundational questions concerning the interpretation of, and the relation between, theories are addressed, which can lead to conceptual clarification of the foundations of physics. Such conceptual progress may well prove to be the decisive stepping stone to a full quantum theory of gravity. Second, quantum gravity is a fertile ground for any metaphysician as it will inevitably yield implications for specifically philosophical, and particularly metaphysical, issues concerning the nature of space and time.