Complexity and Entropy

Complexity is not characterized solely by the variegated forms of connectivity among the parts that build up a system; it is also measured by ‘orderliness’. Entropy is the measure of disorder in a system.(1) Qualitatively, entropy describes the changes a system undergoes relative to its earlier stages by tracking energy transformations from one state to another, and this has urged scientists to devise formulas that measure exactly the change, or the degree of disorder, the system undergoes during such transformations. One such exponent is the physicist Peter Landsberg, who drew on thermodynamics and information theory to argue that, when the constraints operating upon a system prevent it from entering one or more of its possible or permitted states, the total amount of disorder in the system is measured by the formula:
Disorder = C_D / C_I
and
Order = 1 - C_O / C_I
where C_D is the “disorder” capacity of the system, i.e. the entropy of the parts contained in the permitted ensemble; C_I is the “information” capacity of the system, an expression similar to Shannon’s channel capacity (2); and C_O is the “order” capacity of the system.
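For concreteness, here is a minimal Python sketch of these two ratios for a hypothetical system. The number of states, the probabilities over the permitted states, and the choice of setting the order capacity equal to C_D are all assumptions made purely for illustration, not part of Landsberg's own treatment.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy system: 8 conceivable states, but the constraints permit only 5 of them,
# occupied with the hypothetical probabilities below.
total_states = 8
permitted_probs = [0.4, 0.3, 0.15, 0.1, 0.05]

C_I = math.log2(total_states)           # information capacity: max entropy over all states
C_D = shannon_entropy(permitted_probs)  # disorder capacity: entropy of the permitted ensemble
C_O = C_D                               # assumption for this sketch only

disorder = C_D / C_I                    # Disorder = C_D / C_I
order = 1 - C_O / C_I                   # Order = 1 - C_O / C_I

print(f"C_I = {C_I:.3f} bits, C_D = {C_D:.3f} bits")
print(f"Disorder = {disorder:.3f}, Order = {order:.3f}")
```

With these made-up numbers the system sits at roughly 0.69 disorder: its permitted ensemble carries about two-thirds of the entropy it could carry if every state were equally available.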
Despite the strides made in quantifying complexity with the techniques of entropy, the major hitch remains the diversity of ways in which the notion itself is talked about, and this limits the practical use value of entropy in dealing with complexity. Another disadvantage of entropy as a measure of complexity is that it provides no detailed picture of structure: it yields a single number and says nothing about how the parts of the system are actually arranged.
(1) There are many ways of depicting or talking about entropy, for instance in terms of thermodynamics, randomness, or stochasticity; though mathematically equivalent, these are used according to the systems under study. Thermodynamically, entropy captures a system’s susceptibility towards spontaneous change, with an isolated system never undergoing any decrease in entropy. Entropy is also used in information theory, as developed by Claude Shannon, to denote the number of bits required for storage and/or communication. For Shannon, entropy quantifies the uncertainty involved in encountering a random variable. An excellent book dealing with the philosophical and physical dimensions of time reversal, the second law of thermodynamics, and the asymmetries in our epistemic access to the past and the future is David Albert’s Time and Chance.
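As a quick illustration of Shannon’s sense of entropy as uncertainty measured in bits, the following sketch (with made-up coin probabilities chosen only for the example) computes the entropy of a few simple distributions:

```python
import math

def H(probs):
    """Shannon entropy in bits: the average uncertainty per outcome."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(H([0.5, 0.5]))  # fair coin: 1.0 bit of uncertainty per toss
print(H([0.9, 0.1]))  # biased coin: ~0.47 bits, less uncertain, more compressible
print(H([1.0]))       # certain outcome: 0 bits, no uncertainty at all
```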
(2) In information theory, channel capacity is the tightest upper bound on the amount of information that can be reliably transmitted over a communications channel. Claude E. Shannon defined the notion of channel capacity and provided a mathematical model by which one can compute it. The capacity of a channel is given by the maximum of the mutual information between the input and the output of the channel, where the maximization is taken over the input distribution.
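For a binary symmetric channel this maximization can be carried out directly. The sketch below (the crossover probability 0.1 is an arbitrary choice for illustration) searches over input distributions and compares the result with the well-known closed form 1 - H(p):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information_bsc(q, p):
    """I(X;Y) for a binary symmetric channel with crossover probability p
    and input distribution P(X=1) = q."""
    py1 = q * (1 - p) + (1 - q) * p   # output distribution P(Y=1)
    return h2(py1) - h2(p)            # I(X;Y) = H(Y) - H(Y|X), with H(Y|X) = h2(p)

p = 0.1  # hypothetical crossover probability
# Capacity = maximum of the mutual information over the input distribution;
# here we simply search q on a fine grid.
capacity = max(mutual_information_bsc(q / 1000, p) for q in range(1001))
print(f"numerical capacity ~ {capacity:.4f} bits per channel use")
print(f"closed form 1 - h2(p) = {1 - h2(p):.4f} bits per channel use")
```

The search confirms that the maximum is attained at the uniform input distribution, which is exactly the sense in which capacity is a property of the channel rather than of any particular source.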