The most obvious feature of essentially all financial markets is the apparent randomness with which prices tend to fluctuate. Nevertheless, the very idea of chance in financial markets clashes with our intuitive sense of the processes regulating the market. All the processes involved seem deterministic. Traders do not merely follow hunches but act in accordance with specific rules, and even when they do appear to act on intuition, their decisions are not random but follow from the best of their knowledge of the internal and external state of the market. For example, traders copy other traders, or repeat decisions that have previously worked, sometimes reacting against information and sometimes acting in accordance with it. Furthermore, nowadays a greater percentage of trading volume is handled algorithmically rather than by humans. Computing systems are used for entering trading orders and for deciding on aspects of an order such as its timing, price and quantity, all of which are algorithmic by definition.
Algorithmic, however, does not necessarily mean predictable. Several types of irreducibility, from non-computability to intractability to unpredictability, are entailed in most non-trivial questions about financial markets.
Wolfram points out that pure speculation, where trading occurs without the possibility of any significant external input, often leads to situations in which prices tend to show more, rather than less, random-looking fluctuations. He also suggests that there is no better way to find the causes of this apparent randomness than by performing an almost step-by-step simulation, with little chance of besting the time it takes for the phenomenon to unfold, the time scales of real-world markets being simply too fast to beat. It is important to note that the intrinsic generation of complexity proves the stochastic notion to be a convenient assumption about the market, but not an inherent or essential one.
Economists may argue that the question is irrelevant for practical purposes. They are interested in decomposing time series into a non-predictable signal and a presumably predictable signal in which they have an interest, traditionally called a trend. Whether one, both, or neither of the two signals is deterministic may be considered irrelevant as long as there is a part that is random-looking, hence most likely unpredictable, and consequently worth leaving out.
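To make the decomposition concrete, the following sketch splits a synthetic price series into a presumed-predictable trend and a random-looking residual. It is only an illustration of the idea: the moving-average window, the synthetic random-walk series and the variable names are assumptions introduced here, not part of the discussion above.

```python
# A minimal sketch of decomposing a price series into a presumed-predictable
# trend and a random-looking residual. The synthetic random-walk series and
# the moving-average window are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
prices = np.cumsum(rng.standard_normal(500))       # stand-in for observed prices

window = 20                                         # assumed smoothing horizon
kernel = np.ones(window) / window
trend = np.convolve(prices, kernel, mode="same")    # the "predictable" part
residual = prices - trend                           # the random-looking rest

print(trend[:5])
print(residual[:5])
```

Whether the residual left over by such a split is genuinely stochastic or merely deterministic-but-unpredictable is exactly the question the decomposition leaves open.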
What Wolfram’s simplified models show, based as they are on simple rules, is that despite being so simple and completely deterministic, they are capable of generating great complexity, exhibiting (the lack of) patterns similar to the apparent randomness found in price movements in financial markets. Whether one can reproduce the kind of crashes into which financial markets seem to fall cyclically depends on whether the generating rule is capable of producing them from time to time. Economists dispute whether crashes reflect the intrinsic instability of the market, or whether they are triggered by external events. Wolfram’s proposal for modeling market prices would have a simple program generate the randomness that occurs intrinsically. A plausible, if simple and idealized, behaviour is shown in the aggregate to produce intrinsically random behaviour similar to that seen in price changes.
In the figure above, one can see that even in some of the simplest possible rule-based systems, structures emerge from a random-looking initial configuration with low information content. Trends and cycles are to be found amidst apparent randomness.
An example of a simple model of the market, where each cell of a cellular automaton corresponds to an entity buying or selling at each step. The behaviour of a given cell is determined by the behaviour of its two neighbours on the step before, according to a rule. A rule like rule 90 is additive, hence reversible, which means that it does not destroy any information and has ‘memory’, unlike the random walk model. Yet, owing to its random-looking behaviour, it is not trivial to shortcut the computation or to foresee any successive step. There is some randomness in the initial condition of the cellular automaton rule that comes from outside the model, but the subsequent evolution of the system is fully deterministic.
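A rough sketch of the model described in this caption is given below: rule 90 acting on a row of cells, each cell an agent buying (1) or selling (0), with randomness entering only through the initial row. The width, the number of steps, the periodic boundary and the use of the buyers-minus-sellers aggregate as the per-step price change are illustrative assumptions, not prescriptions from the original model.

```python
# A minimal sketch of the rule-90 market model: each cell is an agent buying
# (1) or selling (0); a cell's next state is the XOR of its two neighbours
# (rule 90, additive). Randomness enters only in the initial row; every later
# step is fully deterministic. Width, steps, the periodic boundary and the
# buyers-minus-sellers price mapping are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
width, steps = 201, 400
cells = rng.integers(0, 2, size=width)                    # external randomness, once

price_changes = []
for _ in range(steps):
    price_changes.append(int(2 * cells.sum() - width))    # buyers minus sellers
    cells = np.roll(cells, 1) ^ np.roll(cells, -1)         # rule 90 update

price = np.cumsum(price_changes)                           # deterministic "price" path
print(price[:10])
```

The resulting path is completely determined by the initial row, yet it typically looks as irregular as a random walk, which is the point of the model.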
In such a model, sudden large changes are internally generated, suggesting that large changes are more predictable, both in magnitude and in direction, as the result of various interactions between agents. If Wolfram’s intrinsic randomness is what drives the market, one might think one could then easily predict its behaviour, but as suggested by Wolfram’s Principle of Computational Equivalence, it is reasonable to expect that the overall collective behaviour of the market would look complicated to us, as if it were random, and hence would be quite difficult to predict despite being, or having, a large deterministic component.
Wolfram’s Principle of Computational Irreducibility says that the only way to determine the answer to a computationally irreducible question is to perform the computation. According to Wolfram, it follows from his Principle of Computational Equivalence (PCE) that
almost all processes that are not obviously simple can be viewed as computations of equivalent sophistication: when a system reaches a threshold of computational sophistication often reached by non-trivial systems, the system will be computationally irreducible.
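One way to picture the contrast, assuming a single black cell as the initial condition, is that rule 90 happens to be computationally reducible: its cells can be read off from the parity of binomial coefficients without running the automaton, whereas for a rule such as rule 30 no comparable shortcut is known, and the usual way to obtain its centre column is to perform the computation step by step. The sketch below is illustrative only; the function names are introduced here.

```python
# Illustration of computational reducibility vs. (apparent) irreducibility,
# assuming a single black cell as the initial condition.
# Rule 90 admits a known shortcut: the cell at step t, offset i, is black iff
# C(t, (i + t) // 2) is odd (Pascal's triangle mod 2). No such shortcut is
# known for rule 30, so its centre column is obtained by simulating each step.
from math import comb

def rule90_shortcut(t, i):
    """Cell at step t, offset i from the initial cell, without simulating."""
    if (i + t) % 2 or abs(i) > t:
        return 0
    return comb(t, (i + t) // 2) % 2

def rule30_center_column(steps):
    """Centre column of rule 30: no known formula, so we simulate."""
    width = 2 * steps + 3
    cells = [0] * width
    cells[width // 2] = 1
    column = []
    for _ in range(steps):
        column.append(cells[width // 2])
        # rule 30: next state = left XOR (centre OR right), periodic boundary
        cells = [cells[j - 1] ^ (cells[j] | cells[(j + 1) % width])
                 for j in range(width)]
    return column

print([rule90_shortcut(4, i) for i in range(-4, 5)])
print(rule30_center_column(12))
```

The first call answers a question about rule 90 with a few arithmetic operations per cell; for the second there is no known alternative to carrying out every intermediate step, which is what computational irreducibility amounts to in practice.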
[…] password, it would have to outperform the unfolding processes affecting the market – which, as Wolfram’s PCE suggests, would require at least the same computational sophistication as the market itself, with […]