In classical evolutionary biology the fitness landscape of possible strategies is considered static. Optimization theory is therefore the usual tool to analyze the evolution of strategies, which consequently tend to climb the peaks of the static landscape. In more realistic scenarios, however, the evolution of populations modifies the environment, so that the fitness landscape becomes dynamic. In other words, the maxima of the fitness landscape depend on the number of specimens that adopt each strategy (a frequency-dependent landscape). In this case, when evolution depends on the agents' actions, game theory is the appropriate mathematical tool to describe the process. And this is precisely the scheme in which the evolving physical laws (i.e. algorithms or strategies) are generated from agent-agent interactions (a bottom-up process) subject to natural selection.
The concept of evolutionarily stable strategy (ESS) is central to evolutionary game theory. An ESS is a strategy that, when adopted by the great majority – almost all systems in a population – cannot be displaced by any alternative strategy. In general, an ESS is not necessarily optimal; however, it may be assumed that in the last stages of evolution – before the quantum equilibrium is reached – the fitness landscape of possible strategies can be considered static, or at least slowly varying. In this simplified case an ESS would be the strategy with the highest payoff, thereby satisfying an optimization criterion. Different ESSs could exist in other regions of the fitness landscape.
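As a concrete illustration (not part of the original argument), the classic Hawk–Dove game has a mixed ESS at hawk frequency p* = V/C, and replicator dynamics converge to it from any interior starting frequency, so no rare mutant strategy can invade. A minimal sketch, with illustrative payoff values V = 2 and C = 4:

```python
# Replicator dynamics for the Hawk-Dove game (illustrative values V=2, C=4).
# The mixed ESS is at hawk frequency p* = V/C = 0.5; any interior starting
# frequency converges to it, which is what makes the strategy uninvadable.
V, C = 2.0, 4.0
payoff = [[(V - C) / 2, V],      # Hawk vs (Hawk, Dove)
          [0.0,         V / 2]]  # Dove vs (Hawk, Dove)

def step(p, dt=0.01):
    """One Euler step of the replicator equation dp/dt = p (f_H - f_bar)."""
    f_hawk = p * payoff[0][0] + (1 - p) * payoff[0][1]
    f_dove = p * payoff[1][0] + (1 - p) * payoff[1][1]
    f_bar = p * f_hawk + (1 - p) * f_dove
    return p + dt * p * (f_hawk - f_bar)

p = 0.9                # start with hawks heavily over-represented
for _ in range(10000):
    p = step(p)
print(round(p, 3))     # converges to the ESS frequency V/C = 0.5
```

The same run started from p = 0.1 converges to the same interior point, which is the frequency-dependent analogue of climbing a peak in a static landscape.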
In the information-theoretic Darwinian approach it seems plausible to adopt the optimization of the system's information flows as the optimization criterion. A set of three regulating principles could be:
Structure: The complexity of the system is optimized (maximized). The definition adopted for complexity is Bennett's logical depth, which for a binary string is the time needed to execute the minimal program that generates that string. There is no generally accepted definition of complexity, nor is there a consensus on the relation between the increase of complexity – for a given definition – and Darwinian evolution. However, there seems to be some agreement that, in the long term, Darwinian evolution should lead to an increase in complexity in the biological realm, for an adequately natural definition of this concept. The complexity of a system at time t in this theory would then be Bennett's logical depth of the program stored at time t in its Turing machine. The increase of complexity is a characteristic of Lamarckian evolution, and it is also admitted that the trend of evolution in the Darwinian theory is in the direction of growing complexity, although whether this tendency depends on the timescale – or on other factors – is still not very clear.
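For reference, Bennett's notion can be stated precisely. The following is one standard formulation of s-significant logical depth for a universal Turing machine U; the notation is not taken from the text above:

```latex
% Bennett's s-significant logical depth of a binary string x on a universal machine U:
\mathrm{depth}_s(x) \;=\; \min\bigl\{\, T(U,p) \;:\; U(p) = x,\ \ \ell(p) \le K(x) + s \,\bigr\}
```

where T(U, p) is the running time of program p on U, ℓ(p) is the length of p, and K(x) is the Kolmogorov complexity of x. Like K(x) itself, this quantity is uncomputable, which is consistent with the remark below that such measures are very difficult even to estimate.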
Dynamics: The information outflow of the system is optimized (minimized). The information measure used is the Fisher information of the probability density function of the system's position. According to S. A. Frank, natural selection acts to maximize the Fisher information within a Darwinian system. As a consequence, assuming that the flow of information between a system and its surroundings can be modeled as a zero-sum game, Darwinian systems would follow dynamics that minimize their information outflow.
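To make the Dynamics principle concrete: among densities of fixed variance, the Gaussian minimizes the Fisher information J = ∫ (p′(x))²/p(x) dx, saturating the Cramér–Rao bound J ≥ 1/σ². A numerical check (an illustration assumed here, not taken from the text) comparing a unit-variance Gaussian with a unit-variance Laplace density:

```python
import math

def fisher_information(p, xs, h=1e-5):
    """Numerical J = integral of (p'(x))^2 / p(x), via central differences
    and a Riemann sum over the grid xs."""
    dx = xs[1] - xs[0]
    total = 0.0
    for x in xs:
        px = p(x)
        if px < 1e-12:          # skip negligible tails to avoid division issues
            continue
        dp = (p(x + h) - p(x - h)) / (2 * h)
        total += dp * dp / px * dx
    return total

def gauss(x):                   # N(0, 1); exact J = 1/sigma^2 = 1
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

b = 1 / math.sqrt(2)            # Laplace scale chosen so the variance 2*b^2 = 1
def laplace(x):                 # exact J = 1/b^2 = 2 for this unit-variance case
    return math.exp(-abs(x) / b) / (2 * b)

xs = [-10 + i * 0.001 for i in range(20001)]
J_gauss = fisher_information(gauss, xs)
J_laplace = fisher_information(laplace, xs)
print(J_gauss, J_laplace)       # about 1.0 vs about 2.0: the Gaussian attains the bound
```

With equal variances, the Gaussian's Fisher information equals the Cramér–Rao limit 1/σ² while the Laplace density exceeds it, which is the sense in which a minimum-Fisher-information dynamics selects the smoothest admissible distribution.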
Interaction: The interaction between two subsystems optimizes (maximizes) the complexity of the total system. Complexity is again equated to Bennett's logical depth. The role of Interaction is central in the generation of composite systems, and therefore in the structure of the information processor of a composite system, which results from the logical interconnections among the processors of its constituents. There is an enticing option of defining the complexity of a system in contextual terms, as the capacity of the system for anticipating the behavior at t + ∆t of the surrounding systems contained in the sphere of radius r centered at the position X(t) occupied by the system. This definition would lead directly to the maximization of the predictive power of the systems that maximized their complexity. However, this magnitude would be very difficult even to estimate, in principle far more so than the usual definitions of complexity.
Quantum behavior of microscopic systems should now emerge from the ESS. In other words, the postulates of quantum mechanics should be deducible from the application of the three regulating principles to physical systems endowed with an information processor.
Let us apply Structure. It is reasonable to consider that maximizing the complexity of a system would in turn maximize its predictive power, and this optimal capacity for statistical inference would plausibly induce the complex Hilbert-space structure of the system's space of states. Let us now consider Dynamics. This is basically the application of the principle of minimum Fisher information, or maximum Cramér–Rao bound, to the probability distribution function of the system's position. The concept of entanglement seems decisive for studying the generation of composite systems, which in this theory proceeds through the application of Interaction. The theory admits a simple model that characterizes the entanglement between two subsystems as the mutual exchange of randomizers (R1, R2), programs (P1, P2) – with their respective anticipation modules (A1, A2) – and wave functions (Ψ1, Ψ2). In this way, each subsystem can anticipate not only the behavior of its own surrounding systems, but also that of the environment of its entangled partner. In addition, entanglement can be considered a natural phenomenon in this theory, a consequence of the tendency to increase complexity, and therefore, in a certain sense, experimental support for the theory.
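The exchange model admits a simple object-level sketch. The code below is purely illustrative scaffolding under assumed names (Subsystem, entangle, can_anticipate are hypothetical, not the paper's formalism): entanglement is modeled as each subsystem receiving a copy of its partner's (R, P/A, Ψ) triple, after which it can "anticipate" the partner's environment as well as its own.

```python
# Toy sketch of entanglement as mutual exchange of randomizers (R), programs
# with anticipation modules (P/A), and wave functions (Psi). All class and
# function names are hypothetical illustrations, not the theory's formalism.
from dataclasses import dataclass, field

@dataclass
class Subsystem:
    name: str
    randomizer: str
    program: str          # program together with its anticipation module
    wavefunction: str
    received: dict = field(default_factory=dict)  # modules copied from partners

    def can_anticipate(self, environment_of: str) -> bool:
        # A subsystem anticipates its own surroundings via its own program,
        # and a partner's surroundings only after the exchange below.
        return environment_of == self.name or environment_of in self.received

def entangle(s1: Subsystem, s2: Subsystem) -> None:
    """Mutual exchange of (R, P/A, Psi) copies between the two subsystems."""
    s1.received[s2.name] = (s2.randomizer, s2.program, s2.wavefunction)
    s2.received[s1.name] = (s1.randomizer, s1.program, s1.wavefunction)

a = Subsystem("S1", "R1", "P1+A1", "Psi1")
b = Subsystem("S2", "R2", "P2+A2", "Psi2")
print(a.can_anticipate("S2"))   # False before the exchange
entangle(a, b)
print(a.can_anticipate("S2"), b.can_anticipate("S1"))   # True True afterwards
```

The point of the sketch is only the bookkeeping: the enlarged anticipation capacity of each part after the exchange is what, in the theory's terms, raises the complexity of the composite system.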
In addition, the information-theoretic Darwinian approach is a minimalist realist theory – every system follows a continuous trajectory in time, as in Bohmian mechanics – and a local theory in physical space. Apparent nonlocality, as in violations of Bell's inequalities, would be an artifact of the anticipation module operating in information space. Randomness, however, would necessarily be intrinsic to nature through the random number generator methodologically associated with every fundamental system at t = 0, an essential ingredient to start and fuel – through variation – Darwinian evolution. As time increases, random events determined by the random number generators would progressively be replaced by causal events determined by the evolving programs that gradually take control of the elementary systems. Randomness would thus be displaced by causality as physical Darwinian evolution gave rise to the quantum equilibrium regime – but not completely, since randomness would still play a crucial role in the optimization of strategies, and thus of information flows, as game theory states.