The Statistical Physics of Stock Markets. Thought of the Day 143.0


The externalist view argues that we can make sense of, and profit from, stock markets’ behavior, or at least a few crucial properties of it, by crunching numbers and looking for patterns and regularities in certain sets of data. The notion of data is hence a key element in such an understanding, and the quantitative side of the problem is prominent, even if this does not mean that qualitative analysis is ignored. The point here is that the outside view maintains that it provides a better understanding than the internalist view. To this end, it endorses a functional perspective on finance and on stock markets in particular.

The basic idea of the externalist view is that there are general properties and behaviors of stock markets that can be detected and studied through a mathematical lens, and that these do not depend much on contextual or domain-specific factors. The point at stake is that financial systems can be studied and approached at different scales, and it is virtually impossible to produce all the equations describing, at a micro level, all the objects of the system and their relations. In response, this view focuses on those properties that allow us to understand the behavior of the system at a global level without having to produce a detailed conceptual and mathematical account of the inner ‘machinery’ of the system. Hence the two roads: the first is to embrace an emergentist view of stock markets, that is, a specific metaphysical, ontological, and methodological thesis; the second is to embrace a heuristic view, that is, the idea that the choice to focus on those properties that are tractable by mathematical models is a pure problem-solving option.

A typical instance of the externalist approach is the one provided by statistical physics. In describing collective behavior, this discipline neglects all the conceptual and mathematical intricacies deriving from a detailed account of the inner, individual, micro-level functioning of a system. Concepts such as stochastic dynamics, self-similarity, correlations (both short- and long-range), and scaling are the tools for achieving this aim. Econophysics is a stock example in this sense: it employs methods taken from mathematics and mathematical physics in order to detect and forecast the driving forces of stock markets and their critical events, such as bubbles, crashes and their tipping points. In this respect, markets are not ‘dark boxes’: you can see their characteristics from the outside, or better, you can see specific dynamics that shape the trends of stock markets deeply and for a long time. Moreover, these dynamics are complex in the technical sense: this class of behavior encompasses timescales, ontology, types of agents, ecologies, regulations, laws, etc., and can be detected, even if not strictly predicted. We can focus on stock markets as a whole, or on a few of their critical events, looking at the data of prices (or other indexes) and ignoring all the other details and factors, since these will be absorbed into the global dynamics. This view thus presents stock markets not as an unintelligible casino where wild gamblers face each other, but as a system whose properties serve mostly as a means of fluid transactions that enable and ease the functioning of free markets.
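The short- versus long-range correlations mentioned above show up even in crude simulations. Here is a minimal, self-contained sketch in Python on purely synthetic data (the GARCH-like volatility recursion and all of its parameters are illustrative assumptions, not estimates from any real market): raw returns decorrelate almost immediately, while their magnitudes, a proxy for volatility, stay correlated over long lags.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic return series with volatility clustering: a GARCH-like toy,
# purely illustrative -- the coefficients are assumptions, not fitted values.
n = 5000
vol = np.empty(n)
returns = np.empty(n)
vol[0], returns[0] = 0.01, 0.0
for t in range(1, n):
    # Today's variance feeds on yesterday's shock and yesterday's variance,
    # so calm and turbulent periods each persist.
    vol[t] = np.sqrt(1e-6 + 0.1 * returns[t - 1] ** 2 + 0.85 * vol[t - 1] ** 2)
    returns[t] = vol[t] * rng.standard_normal()

def autocorr(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Raw returns: essentially no memory. Absolute returns: long-range memory.
print("lag 10, returns:   ", round(autocorr(returns, 10), 3))
print("lag 10, |returns|: ", round(autocorr(np.abs(returns), 10), 3))
print("lag 100, |returns|:", round(autocorr(np.abs(returns), 100), 3))
```

The contrast between the two autocorrelations is the kind of global, model-light regularity the externalist view has in mind: it is detectable from price data alone, without any account of the market's inner machinery.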

Moreover, the study of complex systems theory and that of stock markets seem to offer mutual benefits. On one side, complex systems theory seems to offer a key to understanding and breaking through some of the most salient properties of stock markets. On the other, stock markets seem to provide a ‘stress test’ for complexity theory. Didier Sornette expresses how the analogies between stock markets and phase transitions, statistical mechanics, nonlinear dynamics, and disordered systems mold the view from outside:

Take our personal life. We are not really interested in knowing in advance at what time we will go to a given store or drive to a highway. We are much more interested in forecasting the major bifurcations ahead of us, involving the few important things, like health, love, and work, that count for our happiness. Similarly, predicting the detailed evolution of complex systems has no real value, and the fact that we are taught that it is out of reach from a fundamental point of view does not exclude the more interesting possibility of predicting phases of evolutions of complex systems that really count, like the extreme events. It turns out that most complex systems in natural and social sciences do exhibit rare and sudden transitions that occur over time intervals that are short compared to the characteristic time scales of their posterior evolution. Such extreme events express more than anything else the underlying “forces” usually hidden by almost perfect balance and thus provide the potential for a better scientific understanding of complex systems.

Phase transitions, critical points, and extreme events seem to be so pervasive in stock markets that they are the crucial concepts to explain and, where possible, foresee. Complexity theory provides a fruitful reading key to understand their dynamics, namely their generation, growth and occurrence. Such a reading key proposes a clear-cut interpretation of them, which can again be explained by means of an analogy with physics, precisely with the unstable position of an object. Complexity theory suggests that critical or extreme events occurring at large scale are the outcome of interactions occurring at smaller scales. In the case of stock markets, this means that, unlike many approaches that attempt to account for crashes by searching for ‘mechanisms’ that work at very short time scales, complexity theory indicates that crashes have causes dating back months or years before the event. This reading suggests that it is the increasing inner interaction between the agents inside the markets that builds up the unstable dynamics (typically the financial bubbles) that eventually end in a critical event, the crash. But the specific, final step that triggers the critical event, the collapse of prices, is not the key to its understanding: a crash occurs because the market is in an unstable phase, and any small interference or event may trigger it. The bottom line: the trigger can be virtually any event external to the markets. The real cause of the crash is the market’s overall unstable position; the proximate ‘cause’ is secondary and accidental. In other words, a crash is fundamentally endogenous in nature, whilst an exogenous, external shock is simply its occasional triggering factor. The instability is built up by a cooperative behavior among traders, who imitate each other (in this sense it is an endogenous process) and so form and reinforce trends that converge up to a critical point.
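The endogenous build-up of instability through imitation can be caricatured in a toy herding model, sketched below in Python. To be clear, this is a hedged illustration of the general idea only, not Sornette's actual model: the linear ramp of the imitation strength and every number in it are arbitrary assumptions. As imitation strengthens, an initially balanced population of traders drifts toward near-unanimity, the ‘unstable phase’ in which diversity of opinion, and with it the market's shock-absorbing capacity, has vanished.

```python
import random

random.seed(42)

N = 1000  # number of traders
# +1 = bullish, -1 = bearish; the population starts perfectly balanced.
state = [1] * (N // 2) + [-1] * (N // 2)

def step(state, coupling):
    """Each trader imitates the prevailing sentiment with probability
    `coupling`, and otherwise picks an opinion at random."""
    mean = sum(state) / len(state)
    new = []
    for _ in state:
        if random.random() < coupling:
            # imitation: side with the majority (tie -> coin flip)
            new.append(1 if mean > 0 else -1 if mean < 0 else random.choice([1, -1]))
        else:
            new.append(random.choice([1, -1]))
    return new

# Slowly ratchet up the strength of imitation, as in a growing bubble.
for t in range(200):
    coupling = t / 200  # 0 -> weak herding, ~1 -> pure imitation
    state = step(state, coupling)

consensus = abs(sum(state)) / N
print(f"consensus after herding phase: {consensus:.2f}")
# Near-unanimity is the signature of the unstable, susceptible phase
# described above: the cause of the crash is this built-up position,
# not whatever small external event eventually perturbs it.
```

The point of the sketch is the asymmetry the text insists on: the slow, endogenous ramp of imitation is what matters causally, while the eventual trigger is interchangeable and accidental.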

The main advantage of this approach is that the system (the market) would anticipate the crash by releasing precursory fingerprints observable in the stock market prices: the market prices contain information on impending crashes and this implies that:

if the traders were to learn how to decipher and use this information, they would act on it and on the knowledge that others act on it; nevertheless, the crashes would still probably happen. Our results suggest a weaker form of the “weak efficient market hypothesis”, according to which the market prices contain, in addition to the information generally available to all, subtle information formed by the global market that most or all individual traders have not yet learned to decipher and use. Instead of the usual interpretation of the efficient market hypothesis in which traders extract and consciously incorporate (by their action) all information contained in the market prices, we propose that the market as a whole can exhibit “emergent” behavior not shared by any of its constituents.

In a nutshell, the critical events emerge in a self-organized and cooperative fashion as the macro result of the internal and micro interactions of the traders, their imitation and mirroring.

 

Top-down Causation in Financial Markets. Note Quote.


Regulators attempt to act on a financial market based on the intelligent and reasonable formulation of rules. For example, changing the market micro-structure at the lowest level in the hierarchy can change the way that asset prices assimilate changes in the information variables Z_{k,t} or θ_{i,m,t}. Similarly, changes in accounting rules could change the meaning and behaviour of the bottom-up information variables θ_{i,m,t}, and changes in economic policy and policy implementation can change the meaning of the top-down information variables Z_{k,t} and influence the shared risk factors r_{p,t}.

In hierarchical analysis, theories and plans may be embodied in a symbolic system to build effective and robust models to be used for detecting deeper dependencies and emergent phenomena. Mechanisms for the transmission of information, and asymmetric information, have impacts on market quality. Thus Regulators can impact the activity and success of all the other actors, either directly or indirectly through knock-on effects. Examples include the following: Investor behaviour could change the goal selection of Traders; a change in the latter could in turn impact variables coupled to Trader activity in such a way that Profiteers are able to benefit from the change in liquidity, or use leverage as a means to achieve profit targets and overcome noise.

Idealistically, Regulators may aim at increasing productivity, managing inflation, reducing unemployment and eliminating malfeasance. However, the circumvention of rules, usually in the name of innovation or by claims of greater insight into optimality, is as much part of a complex system in which participants can respond to rules. Tax arbitrages are examples of actions which manipulate reporting to reduce levies paid to a profit-facilitating system. In regulatory arbitrage, rules may be followed technically, but nevertheless exploit relevant new information which has not been accounted for in the system’s rules. Such activities are consistent with the goals of profiteering, but are not necessarily in agreement with the longer-term optimality of reliable and fair markets.

Rulers, i.e. agencies which control populations more generally, also impact markets and economies. Examples of top-down causation here include the segregation of workers and the differential assignment of economic rights to market participants, as in the evolution of local miners’ rights in late-1800s South Africa and the national Natives Land Act of 1913 in South Africa; international agreements such as the Bretton Woods system and the Marshall Plan of 1948; the lifting of the gold standard in 1973; and the regulation of capital allocations and capital flows between individual and aggregated participants. Ideas on target-based goal selection are already in circulation in the literature on applications of viability theory and stochastic control in economics. Such approaches provide alternatives to the Laplacian ideal of attaining perfect prediction by offering analysable future expectations to regulators and rulers.

Autopoiesis Revisited


Autopoiesis principally dealt with determining the essence of living beings to start with, thus calling attention to a distinction between organization and structure. The distinction was drawn with organization subtending the set of all possible relations of the autopoietic processes of an organism, and structure as a synchronic snapshot of whatever from the organizational set was active at any given instant. This distinction was tension-ridden, for the possibility of producing a novel functional structure was inhibited, especially when the system suffered perturbations vis-à-vis the environment that housed it. Thus, within the realm of autopoiesis, a diachronic emergence was conceivable only as natural drift. John Protevi throws light on this perspective with his insistence on synchronic emergence as autonomous, and since autonomy is interest-directed, the question of autopoiesis in the social realm is ruled out. The rejection of extending autopoiesis to the social realm, especially Varela’s rejection, is best understood as a move conceived to go beyond autopoiesis, rather than beyond neocybernetics as concerned with the organizational closure of informational systems, lest the risk of slipping into polarization loom large. The aggrandizing threat of fascistic and authoritarian tendencies imputed to Varela was indeed ill-conceived. This polarity, which Varela considered later in his intellectual trajectory as comprising fragments that constituted the whole and were collectively constructed, was a launch pad for Luhmann to enter the fray and apply autopoiesis to social systems. Autopoiesis forms the central notion of his self-referential systems, where the latter are characterized by acknowledging their referring to themselves in every operation. An autopoietic system, while organizationally closed, nevertheless references an environment, background or context.
This indicates that pure auto-referentiality is generally lacking, replaced instead by a broader process of self-referentiality which comprises hetero-referentiality, with a reference to an environment. This process is watchful of the distinction between itself and the environment, lest it fail to take off. As Luhmann says, if an autopoietic system did not have an environment, it would be forced to invent one as the horizon of its auto-referentiality.

A system distinguishes itself from the environment by boundaries, where the latter is a zone of high-degree complexity and the former one of reduced complexity. Even Luhmann’s system believes in being interest-driven, where communication is selective with the available information to the best of its efficiency. Luhmann likens the operation of autopoiesis to a program making a series of logical distinctions. Here, Luhmann refers to the British mathematician G. Spencer-Brown’s logic of distinctions, which Maturana and Varela had identified as a model for the functioning of any cognitive process. The supreme criterion guiding the “self-creation” of any given system is a defining binary code. This binary code is taken by Luhmann to problematize the auto-referential system’s continuous confrontation with the dilemma of disintegration/continuation. Importantly, Luhmann treats systems on an ontological level, that is, systems exist, and this paradigm is to be changed through the differential relations between the system and the environment.

Philosophically, complexity and self-organizational principles shift trends into interdisciplinarity. Take the case of holism: emergentism within complexity abhors study through reductionism. Scientifically, the notion of holism failed to stamp its authority due to a lack of any solid scientificity, and the hubristic Newtonian paradigm of reductionism as the panacea for all ills came to stay. A rapprochement was not possible until the German biologist Ludwig von Bertalanffy shocked the prevalent world view with his thesis on the openness of living systems, which interact with surrounding systems for their continual survival. This idea deliberated on a system embedded within an environment, separated by a boundary that lent the system its own identity. With input from the environment and output from the system, one can conceive of a plurality of systems interacting with one another to form a network, which, if functionally coherent, is a system in its own right, a supersystem, with the initial systems as its subsystems. This strips the subsystems of any independence, but leaves them determinable within the network via relations and/or mappings. This in general is termed constraint, which abhors independence from the relations between the coupled systems (supersystem/subsystem). If the coupling between the systems is tight enough, an organization with its own identity and autonomy results. Cybernetics deals precisely with such a formulation, where the autonomy in question is maintained through goal-directed, seemingly intelligent action, in line with the thoughts of Varela and Luhmann. This is significant because perturbations originating in the environment are actively compensated for by the system in order to maintain its preferred state of affairs, with a greater amount of perturbation implying greater compensatory action on the part of the system.
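The compensatory dynamic just described, a system actively cancelling environmental perturbations to hold its preferred state, is the basic negative-feedback loop of cybernetics. The following is a minimal sketch in Python (the proportional-gain controller and every number in it are illustrative assumptions, not a claim about how any particular system is built): the larger the perturbations the environment delivers, the larger the compensations the system must exert.

```python
import random

random.seed(1)

# A toy cybernetic loop: the system holds a preferred internal state
# against random environmental perturbations by applying a correction
# proportional to its current deviation (negative feedback).
preferred = 0.0
state = 0.0
gain = 0.8  # how aggressively the system compensates

total_perturbation = 0.0
total_compensation = 0.0

for t in range(1000):
    perturbation = random.gauss(0, 1.0)        # the environment pushes
    state += perturbation
    correction = gain * (preferred - state)    # the system pushes back
    state += correction
    total_perturbation += abs(perturbation)
    total_compensation += abs(correction)

print(f"final deviation from preferred state: {abs(state):.3f}")
print(f"mean perturbation per step: {total_perturbation / 1000:.3f}")
print(f"mean compensation per step: {total_compensation / 1000:.3f}")
```

The two running totals make the text's point concrete: the compensation exerted tracks the perturbation absorbed, so a noisier environment demands proportionally more compensatory work from the system, while the deviation from the preferred state stays bounded.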
One consequence of such a systemic perspective is that it gets rid of the Cartesian mind-matter split by thinking of it as nothing more than a special kind of relation. Such is the efficacy of autopoiesis in negotiating the dilemma surrounding the metaphysical question concerning the origin of order.

Complexity Theory and Philosophy: A Peace Accord


Complexity has impacted fields diverse from the one it originated in, i.e. science. It has touched the sociological domains and the organizational sciences but, sadly, has not had much of a say in mainstream academic philosophy. In sociology, John Urry (2003) examines the ideas of chaos and complexity in carrying out analyses of global processes. He does this because he believes that systems are balanced between order and chaos and that there is no teleological move towards any state of equilibrium, as the events that pilot the system are not only unpredictable but also irreversible. Such events, with their dimension of unpredictability, rupture the space-time regularity that was thought to characterize hitherto known sociological discursive practices. A highly significant contribution that comes along with such an analysis is the distinction between what Urry aptly calls “global networks” and “global fluids”. Global fluids are a topographical space used to describe the de-territorialized movement of people, information, objects and finances in an undirected, nonlinear mode, and are in a way characteristic of emergentism and hybridization. The topographies of global networks and global fluids interact in a complex manner to give rise to emergent properties that define systems as always on the edge of chaos, pregnant with unpredictability.


Cognitive science and evolutionary theory have been inspirational for a lot of philosophical investigations, and have also benefited largely from complexity theory. If such is the case, the perplexing thing is complexity theory’s impact on philosophy, where it has made no major inroads. Why could this be so? Let us ponder this.

Analytical philosophy has always been concerned with analysis and with logical constructs that are to be stringently followed. These rules and regulations take the domain of philosophical investigations falling under the rubric of the analytical tradition away from the holism, uncertainty, unpredictability and subjectivity that are characteristic of complexity. The reason why this could be the case is attributable to complexity theory’s having developed on a base of mathematics and computational theories, which is not the domain of present-day academic philosophy dealing with the social sciences and cultural studies, but is confined to discussions and debates amongst philosophers of science (biology is an important branch here), mathematics and technology. Moreover, the debates and deliberations have concerned themselves with the unpredictable and uncertain implications derived from the vestiges of chaos theory, and not with complexity theory per se. This is symptomatic of the fact that a lot of confusion rests upon viewing these two path-breaking theories as synonymous, which, incidentally, is a mistake, as the former happens at best to be a mere subset of the latter. An ironical fate encountered philosophy, since it dealt with complex notions of language without actually admitting the jargon and technical parlance of complexity theory. If philosophy lets complexity make a meaningful intercourse into its discursive practices, then the alliance could be beneficial. And the branch of philosophy that is making use of this intervention and alliance at present is post-modern philosophy.

The works of Freud and Saussure, as furthered by Lacan and Derrida, not only accorded fecundity to a critique of modernity but also opened up avenues for a meaningful interaction with complexity. French theory at large was quite antagonistic to modernist claims of reducing the diverse world to essential features for better comprehensibility, and this essentially accounts for its affinity with complexity. Even if Derrida never explicitly used the parlance of complexity in his corpus, there appears to be a strong sympathy towards the phenomenon via his take on post-structuralism. On the other hand, Lyotard, in setting out his arguments for the post-modern condition of knowledge, was ecstatic about paralogy as a defining feature, which is no different from what complexity, connectionism and distributed systems would harbor.


Even Deleuze and Guattari are close to the complex approach through their notion of rhizomes, which are non-reductive, non-hierarchical, multiplicity-oriented connections in data representations and interpretations, characterized by horizontal connectivities, as contrasted with arborescent models that find their characterization in vertical and linear determinations. The ideas are further developed by De Landa (2006), where the attempt is to define a new ontology that could be utilized by social scientists. Components that make up assemblages are characterized along two axes: the material, explicating the variable roles components might undergo, and the territorializing/deterritorializing, explicating the processes components might be involved with.


Relations of exteriority define components, implying that components are self-subsistent, or that they never lose their identity in the process of being unplugged from one assemblage to be plugged into another. The relationship between assemblages and components is nonlinearly and complexly defined, since assemblages are affected by lower-level components, but could also potentially act on these components, effecting adaptations in them. This is very similar to the way distributed systems are principally modeled. Then why has philosophy at large not shown much impact from complexity, despite the French theoretical affinities with the latter?

Chaos theory is partly to blame here, for it has twisted the way the structure of a complex system is understood. Such systems have non-linear operational tendencies, and this has obfuscated the notion of meaning as lying squarely on relativism. The robustness of these systems, when looked at in an illuminating manner from the French theoretical perspective, could be advantageous for getting rid of the idea of complex systems as balanced on a knife’s edge, despite their being nonlinearly determinable. If the structure of the system was a problematic, then defining limits and boundaries was no easy job. What is the boundary between the system and the environment? Is it rigorously drawn and followed, or is it a mere theoretical choice and construct? These are valid questions, which philosophy found difficult to come to terms with. They gained intensity with the introduction of self-organizational and/or autopoietic systems. Classical and modern philosophy either had to dismiss these ideas as chimerical, or had to close off its own methods of analysis in dealing with these issues, and both approaches had the detrimental effect of isolating the discipline of philosophy from the cultural domains in which such notions were making positive interventions and inroads. It could safely be said that French theory in a way attempted a rescue mission, and gathered momentum in its success. The major contribution from post-60s continental philosophy was the framing of solutions. Framing, as a schema of interpretation, helped in comprehending and responding to events, and enabled systems and contexts to constitute one another, thus offering a resolution of the boundaries-and-limits issue that had plagued hitherto known philosophical doctrines.

The notion of difference, so central to modernism, was a problematic that needed to be resolved. This was never a problem within French theory, but rather a tonic to be consumed alongside complexity in order to address socio-economic and political issues. Deleuze (1994), for example, in his metaphysical treatise, sought a critique of representation and a systematic inversion of the traditional metaphysical notions of identity and difference. Identities are not metaphysically or logically prior to differences; identities, in whatever category, are pronounced by their derivation from differences. In other words, forms, categories, apperception and resemblances fail to attain their differences in themselves. And, as Deleuze (2003: 32) says,

If philosophy has a positive and direct relation to things, it is only insofar as philosophy claims to grasp the thing itself, according to what it is, in its difference from everything it is not, in other words, in its internal difference.

But the Deleuzean thesis on metaphysics does make a political intervention, as when he says,

The more our daily life appears standardized, stereotyped, and subject to an accelerated reproduction of objects of consumption, the more art must be injected into it in order to extract from it that little difference which plays simultaneously between other levels of repetition, and even in order to make the two extremes resonate — namely, the habitual series of consumption and the instinctual series of destruction and death. (Deleuze 1994: 293).(1)

Tackling the complexity within the social realm head-on does not lie in extrapolating convenient generalities and thereafter trying to fathom how finely they fit together, but rather in apprehending the relational schema of the network within which individuals emerge as subjects, objects and systems that are capable of grasping real things.(2)

One major criticism leveled against complexity is that it is sympathetic to relativism, just like most French theoretical thought. Whether this accusation has any substance to it could be measured by the likes of circular, meaningless debates such as the Sokal hoax. The hoax was platitudinous to say the least, and vague at best. And why would this be so? Sokal, in his article “Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity”, incorporated the vocabulary of his specialized discipline to unearth the waywardness of its usage by the French theorists. This, for Sokal, was fashionable nonsense, or an act of making noise. He takes the French theorists to task for a liberal use of terms like chaos, complexity, quantum, relativity, gender, difference, topology and deconstruction, without any proper insight. Who was vague in the Sokal affair, the physicist, or the bunch of French theorists? Such an issue could be tackled as a concern of intelligibility. Intelligibility is a result of differentiation and not a guarantee of a truth-giving process (Cilliers 2005: 262).

Clear communication does not give any indisputable identity to a concept. The only way such a meaning can be meaningful is through limitations being set on such communications, once again an ethical choice. These limitations enable knowledge to come into existence, and this must be accepted de facto. In a parallel metaphor with complexity, these limitations or constraints are the sine qua non for autopoiesis to make an entry. Cilliers (2005: 264) is quite on target when he lays down the general schema for complexity: if it is aligned with notions of chaos, randomness and noise, the accusations of relativism and vagueness will start to hold water; if it is aligned with a notion of structure as the result of contingent constraints, we can make claims about complex systems which are clear and comprehensible, despite the fact that the claims themselves are historically contingent.

Undoubtedly, complexity rides on modesty. But the accusations against this position only succeed in portraying complexity as weak, a gross mistake in itself. Let us take Derrida here, as read by Sweetman (1999). Sweetman cites Derrida as an ideal post-modernist, and thereafter launches an attack on his works as confusing aesthetics with metaphysics, as mistakenly siding with assertions over arguments in philosophy, as holding to moral and epistemological relativism, and as self-contradictory with a tinge of intellectual arrogance. Such accusations, though addressed by Derrida and his scholars at various times, nevertheless find parallels in complexity, where the split is between proponents of mathematical certainty in dealing with complexity on the one hand, and proponents of metaphorical proclivities in dealing with the phenomenon on the other. So how would relativism make an entry here? Being a relativist is as good as swimming in paradoxical intellectual currents, and such a position is embraced due to a lack of any foundational basis for knowledge, if nothing more. The counter-argument against the relativistic stance of complexity could be framed in a simple manner, by citing the case of limited knowledge as not being relativistic knowledge. If these forms of knowledge were equated in any manner, it would only help close doors on investigation.

A look at Luhmann’s use of autopoiesis in social theory is warranted here. This is necessitated by the fact of autopoiesis being directly imported from the biological sciences, to which even Varela had objections, though he later changed tracks intellectually. Luhmann considers the leaving out of self-referentiality a problematic in the work of the Chileans (Maturana and Varela), since for Luhmann systems are characterized by general patterns which can just be described as “making a distinction and crossing the boundary of the distinction [which] enables us to ask questions about society as a self-observing system” (Hayles, Luhmann, Rasch, Knodt & Wolfe 1995). Such a reaction from Luhmann is part of his caution against any direct import from the biological and psychological sciences into descriptions of society and social theory. Reality is always distorted through the lens of perception, and this blinds humans from seeing things-in-themselves (the Kantian noumenon). One could visualize this within the analytical tradition of language as a problematic involving oppositional thinking within the binary structure of linguistic terms themselves. What is required is an evolutionary explanation of how systems survive to the extent that they can learn to handle the inside/outside difference within the system, and within the context of their own operations, since they can never operate outside the system (Hayles, Luhmann, Rasch, Knodt & Wolfe 1995). For social theory to be effective, what requires deconstruction is the grand tautological claim of autopoiesis, the unity of the system as produced by the system itself. Luhmann tells us that a methodology that undertakes such a task must do so empirically, by identifying the operations which produce and reproduce the unity of the system (Luhmann 1992).
This is a crucial point, since the classical/traditional questions regarding the problem of reference as conditioning meaning and truth rest on the distinction between subject and object. Luhmann thinks of these questions as quasi-questions, and urges their replacement by the self-reference/external-reference distinction for any meaningful transformation to take effect. In his communications theory(3), he states flatly that a system depends upon “introducing the difference between system and environment into the system” as the internal split within the system itself that allows it to make the distinction needed to begin its operative procedures (Luhmann 1992: 1420). The self-reference/external-reference distinction is a contingent process, and is open to temporal forms of difference. How to define the operation that differentiates the system and organizes the difference between system and environment, while maintaining reciprocity between dependence and independence, is a question that demands resolution. The breakthrough for autopoietic systems is provided by the notion of structural coupling, which effects a renunciation of the idea of overarching causality on the one hand, and the retention of the idea of highly selective connections between systems and environments on the other. Structural coupling maintains this reciprocity between dependence and independence. Moreover, autopoietic systems are defined by the way they are, by their mode of being in the world, and by the way they overcome or encounter entropy in the world. In other words, autopoietic systems are self-perpetuating systems that continuously perform operational closure and so organize dynamic stability.


Even if the concepts of complexity have not yet traveled far and wide into the discipline of philosophy, the trends are positive. Developments in the cognitive sciences and consciousness studies have far-reaching implications for the philosophy of mind, as does research in the sciences that helps redefine the very notion of life. These researches are carried out within the spectrum of complexity theory, and there is therefore much scope for optimism. Complexity theory is still in an embryonic stage, for it is a theory of the widest possible extent for our understanding of the world we inhabit. Though there are roadblocks along the way, this in no way means the end of the road for complexity, but only a beginning in a new and novel manner.

Complexity theory, as imbibed within adaptive systems, has a major role in evolutionary doctrines. To add to this, the phenomenon of French Theory has incited creative and innovative ways of looking at philosophy, where residues of dualism and reductionism still rest and resist any challenge whatsoever. One way in which complexity and philosophy could come closer is for the latter to start withdrawing its investigations from the how-ness of something and to seriously incorporate its why-ness. The how-ness still seems arrested within the walls of reductionism, mechanicism, modernism, and the pillars of Newtonian science. So an ontological reduction of all phenomena under the governance of deterministic laws is the indelible mark, even if epistemologically a certain guideline of objectivity seems apparent. What really gets missed in this process is creativity, since the world in particular and the universe in general are described as a mechanism following clockwork. Such a view held sway for most of the modern era, but with the scientific revolutions of the 20th century, things began to look awry. Relativity theory, quantum mechanics, chaos, complexity, and more recently string/M-theory were powerful enough in their insights to sweep away the hitherto promising and predictable scientific ventures. One look at quantum mechanics/uncertainty and chaos/non-linear dynamics was enough to dislodge predictability from science. This was followed in succession by systems theory and cybernetics, which were instrumental in highlighting the scientific basis for holism and emergence, and in showing equally well that knowledge is intrinsically subjective. Not just that: autopoiesis clarified that regularity and organization are not given but depend on a dynamically emergent tangle of conflicting forces and random fluctuations, a process very aptly referred to by Prigogine and Stengers (1984) as “order out of chaos”.
In very insightful language, Heylighen, Cilliers and Gershenson (2007) pin their hopes on these different approaches, which are now starting to become integrated under the heading of “complexity science”. Its central paradigm is the multi-agent system: a collection of autonomous components whose local interactions give rise to a global order. Agents are intrinsically subjective and uncertain about the consequences of their actions, yet they generally manage to self-organize into an emergent, adaptive system. Thus uncertainty and subjectivity should no longer be viewed negatively, as the loss of the absolute order of mechanicism, but positively, as factors of creativity, adaptation and evolution…. Although a number of (mostly post-modern) philosophers have expressed similar sentiments, the complexity paradigm still needs to be assimilated by academic philosophy.

Such a need is a requisite for complexity to become more aware about how modeling techniques could be made more robust, and for philosophy to understand and resolve some hitherto unaddressed, but perennial problems.

———————————————————–

1 The political implications of such a thesis are rare, but forceful. To add to the quote above, there are other quotes that deliberate on socio-political themes. For instance:

“We claim that there are two ways to appeal to ‘necessary destructions’: that of the poet, who speaks in the name of a creative power, capable of overturning all orders and representations in order to affirm Difference in the state of permanent revolution which characterizes eternal return; and that of the politician, who is above all concerned to deny that which ‘differs,’ so as to conserve or prolong an established historical order.” (Deleuze 1994: 53).

and,

“Real revolutions have the atmosphere of fétes. Contradiction is not the weapon of the proletariat but, rather, the manner in which the bourgeoisie defends and preserves itself, the shadow behind which it maintains its claim to decide what the problems are.” (Deleuze 1994: 268).

2 It should, however, be noted that only in immanent philosophies of the sort Deleuze propagates can the processes of individuation be accounted for. Moreover, once such an aim is attained, regularities in the world are denied any eternal and universal validation.

3 He defines communication as “a kind of autopoetic network of operations which continually organizes what we seek, the coincidence of self-reference (utterance) and external reference (information)” (1992: 1424). He elaborates:

“Communication comes about by splitting reality through a highly artificial distinction between utterance and information, both taken as contingent events within an ongoing process that recursively uses the results of previous steps and anticipates further ones”. (1992: 1424).

Bibliography

Cilliers, P. (2005) Complexity, Deconstruction and Relativism. In Theory, Culture & Society, Vol. 22 (5). pp. 255 – 267.

De Landa, M. (2006) New Philosophy of Society: Assemblage Theory and Social Complexity. London: Continuum.

Deleuze, G. (1994) Difference and Repetition. Translated by Patton, P. New York: Columbia University Press.

—————- (2003) Desert Islands and Other Texts (1953-1974). Translated by Taormina, M. Los Angeles: Semiotext(e).

Hayles, K., Luhmann, N., Rasch, W., Knodt, E. & Wolfe, C. (1995 Autumn) Theory of a Different Order: A Conversation with Katherine Hayles and Niklas Luhmann. In Cultural Critique, No. 31, The Politics of Systems and Environments, Part II. Minneapolis, MN: University of Minnesota Press.

Heylighen, F., Cilliers, P., and Gershenson, C. (2007) The Philosophy of Complexity. In Bogg, J. & Geyer, R. (eds), Complexity, Science and Society. Oxford: Radcliffe Publishing.

Luhmann, N. (1992) Operational Closure and Structural Coupling: The Differentiation of the Legal System. In Cardozo Law Review, Vol. 13.

Lyotard, J-F. (1984) The Postmodern Condition: A Report on Knowledge. Translated by Bennington, G. & Massumi, B. Minneapolis, MN: University of Minnesota Press.

Prigogine, I. and Stengers, I. (1984) Order out of Chaos. New York: Bantam Books.

Sweetman, B. (1999) Postmodernism, Derrida and Différance: A Critique. In International Philosophical Quarterly XXXIX (1)/153. pp. 5 – 18.

Urry, J. (2003) Global Complexity. Cambridge: Polity Press.