Albert Camus Reads Richard K. Morgan: Unsaid Existential Absurdism

Humanity has spread to the stars. We set out like ancient seafarers to explore the limitless ocean of space. But no matter how far we venture into the unknown, the worst monsters are those we bring with us. – Takeshi Kovacs

What I purport to do in this paper is take up two science-fiction works of Richard Morgan: Altered Carbon (the basis of the Netflix series), the first of the Takeshi Kovacs trilogy and at times a grisly tale of switching bodies to gain immortality, straddling transhumanism (whether by means of enhanced biology, technology, or biotechnology) and posthumanism; and Market Forces, a brutal journey into the heart of corporatized conflict investment by way of the elimination of conscience. Thereafter, a conflation with Camus’ absurdity unravels the paradoxical ambiguity underlying absurdism as a human condition. The paradoxical ambiguity results from Camus’ ambivalence towards the neo-Platonist conception of the ultimate unifying principle: he accepts Plotinus’ principled pattern while rejecting its culmination.

Richard Morgan’s work is a parody, a commentary, or even an epic fantasy overcharged almost to the point of absurdity, bordering on extropianism. If there is any semblance of optimism about the future as a result of Moore’s Law, of ever-denser hardware realizable through computational extravagance, it is spectacularly offset by the complexities of software, a disconnect that Morgan brilliantly transposes onto a society governed by a dystopian ethic. This offsetting disconnect between the physical and the mental, between the tangible and the intangible, is the existential angst writ large on a society maneuvered by the powers that be.

Morgan’s Altered Carbon is no deflection from William Gibson’s cyberpunk, or in places even from Philip K. Dick’s Do Androids Dream of Electric Sheep?, which inspired Ridley Scott’s cult classic Blade Runner, wherein the interface between man and machine coalesces (into “sleeves”, as the novel calls them) while the singularity pundits make hay. But what if the very technological exponent is turned against its progenitors, a point that defines much of Artificial Intelligence ethics today? What if the human mind is digitized, uploaded and downloaded as a mere file, and transferred across platforms (by way of needlecast transmitting DHF, an individual’s digital human freight), rendering the hardware disposable and, at the same time, the software vulnerable as data to the vagaries of the networked age? These are not questions that put ethics alone at stake, but rather a reformatting of humanity off the leash. This forever changes the concept of morality and of death as we know it, for now anyone with adequate resources (note the excess of capitalism here) can technically extend their life for as long as they desire, by re-sleeving into cloned organics or by taking a leaf out of Orwell’s Government to archive citizen records in perpetual storage. Between the novel’s publication in 2002 and now, the fiction in science fiction as a genre has indeed blurred, and what had been the Cartesian devil of mind-body duality leverages the technological metempsychosis of consciousness to bring forth a new perception of morality.

Imagine the needle of the moral compass behaving most erratically, ranging from extreme apathy to moderate conscience in consideration of the economics of collateral damage, with the narrative wrenching through senses, thoughts and emotions before settling into a dystopian plot dense with politics, societal disparity, corruption, abuse of wealth and power, and repressively conservative justice. If the extreme violence in Altered Carbon is distasteful, the spectacle is countered by the fact that human bodies and memories are informational commodities, digitized freight and cortical stacks; busted and mangled physical shells already have access to a sleeve onto which to reincarnate and rehabilitate, opening up new vistas of philosophical disposition and artificially intelligent deliberation on the ethics of the fast-disappearing human-machine interface.

If the personal is political, Altered Carbon results in a concussion of overloaded cyberpunk tropes and is indicative of Morgan’s political leanings, a conclusion only to be confirmed upon reading his later works. This detective melange slithers heavily through the human condition, both light and dark, without succumbing to the derivatives of high-tech and low-life, and keeps the potential of speculative fiction open to exploration. The suffusive metaphysics of longevity and the multiplicity of souls, its spiritual tentacles meeting their adversary in Catholicism, paints a believable futurism on the canvas of the science-fiction spectrum.

Market Forces, on the other hand, is where cyberpunk-style sci-fi is suddenly replaced with a corporatized economy of profit lines driven by the cogency of conflict investment. The world is in a state of dysphoria, with diplomatic lines having given way to negotiation by violence, and contracts won in Ronin-esque car duels, shifting the battlefield from the cyberspace of Altered Carbon to more terrestrial grounds. Importing directly from Gordon Gekko’s “Greed is Good”, corporates enhance their share of GDP via the legal funding of foreign wars. The limits of the philosophy of liberal politics are stretched in analogizing the widening gap between the rich and the marginalized against the backdrop of a crime-ravaged, not-so-futuristic London. Security is rarefied according to economic stratification, and surveillance by the rich reaches absurd levels of sophistication in the absence of sousveillance by the marginalized.

Enter Chris Faulkner, a protagonist defined by a conscience that starts to wither away when he is confronted with taking hard and decisive actions for his firm, Shorn Associates, in the face of the brutality of power dynamics. The intent is real-life testosterone absolutism maximizing the tenets of western capitalism in an ostentatious exhibition of masculinity and competition. The obvious collateral damage is the fissuring of familial and societal values born of conscience. Market Forces has certain parallels in the past, in the writings of Robert Sheckley, the American sci-fi author, who would take an element of society and extrapolate on its inherent violence to the point where the absurd slides into satire. It is in this sliding that the question of the beyond lies: the inevitability of an endowment of aggression defining, or rather questioning, the purpose of the hitherto given legacy of societal ethics.

With no dearth of violence, the dystopian future stagnates into a dysphoria characterized by law and apparatus at the mercy of corporations, which transcend the Government constitutionally through rapacious capitalism. A capitalism so rampant that it transforms the hero into an anti-hero in the unfolding tension between interest and sympathy, disgust and repulsion. The perfectly achievable Market Forces is a realization round the corner, seeking birth between the hallucinogenic madness of speculation and a hyperreality hinging on the philosophy of free markets taken to its logical end in the direction of an unpleasant future. The reductio ad absurdum of neoliberalism is an environment of feral brutality masked with the thinnest veneer of corporate civilization, a speculation that portrays a world where all power equates to violence. This violence is manifested as aggression in a road-rage death match against competitors every time there is a bid for a tender. What goes slightly overboard, in a rather colloquial usage of absurdity, is why any competitor would send its best staff on such idiotic suicide missions.

Camus’ absurdity is born in The Myth of Sisyphus and continues well into The Rebel, but is barely able to free itself from the clutches of triviality. This might appear a bold claim, but its efficacy is to be tested through Camus’ intellectual indebtedness to Plotinus, the Neo-Platonist thinker. Plotinus supplemented Plato’s idea of the One and the Many with gradations of explanatory orders, for only then was a coalescing of explanations with reality conceivable. This coalescing converges into the absolute unity, the One, the necessarily metaphysical ground. Now, Camus accepts Plotinus in the steganographic, but strips the Absolute of its metaphysics. A major strand of absurdity for Camus stems from his dictum, “to understand is, above all, to unify”, and the absence of such a unifying principle vindicates absurdity. Herein one is confronted with the first of the paradoxes: if the Absolute is rejected, why then is there in Camus a nostalgia for unity? The reason is peculiarly caught between his version of empiricism and his monism. His empiricism grants comprehensibility to ordinary experience by way of language and meaning, while anything transcending it is meaninglessness and hinges on Plotinus’ Absolute for comprehensibility, thus making him sound a monist. Added to this contradiction is the face of the Christian God that would appear were the Absolute not rejected, which would then have warranted a clash between good and evil in the face of the paradox of the existence of the latter when God is invested with the qualities of the former. Invoking modernism’s core dictum, Camus then questions spontaneity in the presence of the Absolute by calling attention to scholastic perplexity.

Having rejected the Absolute, Camus takes the absurd condition as a fact. If one treads carefully through The Myth of Sisyphus, it works thus: if a man removes himself, he destroys the situation and hence the absurd condition. Since the absurd condition is taken as a fact, one who destroys himself denies this fact. But he who denies this fact puts himself in opposition to what is, Truth. To oppose the Truth, recognizing it to be true, is to contradict oneself. Recognizing a truth, one ought to preserve it rather than deny it. Therefore, it follows that one ought not to commit metaphysical suicide in the face of a meaningless universe. This is a major paradox in his thought, where evaluative absurdity is deemed worth preserving, starting from the premise that man and the universe juxtaposed together are absurdity itself. So, what we have here is a logical cul-de-sac. But what is of cardinal import is the retention of life in mediating between man and universe as absurdity in polarities. If this is one confrontation with the absurd in life, eschatology is another: an absolute that needs to be opposed, a doctrine that becomes a further sense of the absurd, an ethic of the creation of the absolute rule in the drama of man as a struggle against death.

It is this conjecture that builds up in The Rebel: death as an antagonist subjected to rebellion. The absurdity of death lies in its running across our desire for immortality, in its inexplicability, and in its negating and denying the only meaningful existence known. Contradictorily, death would not be absurd if immortality were possible and if existence as we know it were not the only meaningful existence there is. Camus is prone to a meshwork logic here, for his thought fluctuates between viewing death as an absolute evil and viewing it as a liberator, on account of which it lends legitimacy to freedom. It is not that Camus is unaware of the double bind of his logic; admittedly, he ejects himself from this quandary by deliberating on death not as a transcendental phenomenon but as an ordinary lived experience. If The Myth of Sisyphus holds murder and suicide in an absurdist position by denying the transcendent source of value, The Rebel revels in antagonisms with Nihilism, be it in the sense of “nothing is prohibited” or the absolutist nihilism of “permit all” with its fulcrum on the Absolute. The Rebel epitomizes the intellectual impotency of nihilism. But due credit for Camus’ logical progression is mandated here, for any utopia contains the seed of nihilism, in that any acceptance of an Absolute other than life ultimately leads to tyranny. If this is one strand of the essay, the other is exposited in terms of an unrelenting and absolute opposition to death. Consequently the Rebel, who is the embodiment of Camus’ ethic, cannot kill. To avoid any danger of absolutism in the name of some positive good or value, the absolute value becomes opposition to death, and hence the Rebel’s ethic is one of ceaseless rebellion, opposition and conflict.

Now, with this not very exhaustive treatment of Camus’ notion of absurdity, since there is more than meets the eye in his corpus, let us turn to the conflation with Richard Morgan and justify the thesis we set out with. We shall bring this about through a series of observations.

If antagonism to death is the hallmark of rebellion, then Altered Carbon, with its special hard drives called “stacks” installed in the brainstem, immortalizes consciousness to be ported across humans and across spacetime. Needlecasting, the process by which human consciousness in the format of data gets teleported, suffers disorientation across human hardware, if it can even be called that. Interestingly, this disorientation aggrandizes the receiver into conflict-readiness, a theme that runs continuously through Market Forces as well as Altered Carbon. This state of being conflict- and combat-ready erects armies to quash rebellions. To prevent immortality from being exploited at the hands of the privileged, these armies are trained to withstand torture and drudgery while heightening their perception via steganography. But where the plot goes haywire for Camus’ rebel is that Richard Morgan’s can neutralize and eliminate. That is the first observation.

On to the second, which deals with transhumanism. A particular character, Kovacs’ partner Kristin Ortega, has a neo-Catholic family that is split over God’s view of resurrecting a loved one. The split results from choosing religious coding, the neo-Catholic semblance being that the dead cannot be brought back to life. In these cases, Altered Carbon pushes past its Blade Runner fetish and reflexive cynicism to find something human. But when the larger world is so thin, it is hard to put something like neo-Catholicism in a larger context. Characters have had centuries to get used to the idea of stacks, begging the larger question: why are so many still blindsided by their existence? And why do so few people, including the sour Meths, seem to be doing anything interesting with the technology? Now, Camus’ man is confronted with his absurd and meaningless existence, which will be extinguished by death. There are two choices to consider here. Either he can live inauthentically, implying hiding from the truth that life is meaningless and accepting the standards and values of the crowd, in the process escaping the inner misery and despair that result from an honest appraisal of the facts. Or he can take the authentic choice and live heroically, implying facing the truth of life’s futility and temporarily submitting to despair, which is a necessary consequence but which, if it does not lead to suicide, will eventually purify him. Despair will drive him out of himself and away from trivialities, and by it he will be impelled to commit himself to a life of dramatic choices. This is ingrained in the intellectual idea of neo-Catholicism, with Camus’ allusion that only the use of the Will can cause a man truly to be. Both Takeshi Kovacs in Altered Carbon and Chris Faulkner in Market Forces amply epitomize this neo-Catholicism, albeit not directly, but rather as an existential angst in the form of an intrusion.

Now for the third observation. Truth in Altered Carbon is an excavation of the self, more than a searching of data and a tweaking of it into information. It admonishes one to keep going in no matter which direction, a scalar over the vector territorialization, in order to decrypt what seems hidden, an exercise in futility. Allow me to quote Morgan in full:

You are still young and stupid. Human life has no value. Haven’t you learned that yet, Takeshi, with all you’ve seen? It has no value, intrinsic to itself. Machines cost money to build. Raw materials cost money to extract. But people? You can always get some more people. They reproduce like cancer cells, whether you want them or not. They are abundant, Takeshi. Why should they be valuable? Do you know that it costs us less to recruit and use up a real snuff whore than it does to set up and run a virtual equivalent format? Real human flesh is cheaper than a machine. It’s the axiomatic truth of our times.

In full consciousness, and setting aside the impropriety above, Morgan’s privileging of the machine over human flesh extricates essentialism, mirroring the Camusian take on the meaning of life as inessential but for the burning problem of suicide. This is a direct import from Nietzsche, for whom illusion (the arts, remember Wagner!) lends credibility to life and resolves despair to some extent, whereas for Camus despair is only a coming to terms with this absurd condition, by way of machination in the full knowledge of the condition’s futility and pointlessness. This is most brilliantly corroborated in Morgan’s dictum about how constant repetition can make even the most obvious truths irritating enough to disagree with (Woken Furies).

To conclude: imagine the real world extending into the fictive milieu, or its mirror image, the fictive world territorializing the real, leaving it to portend such an intercourse consequent upon an existential angst. Such an imagination now moves along the coordinates of hyperreality, where it collaterally damages meaning in a violent burst of EX/IM-plosion. This violent burst disturbs the idealized truth, overridden by a hallucinogenic madness prompting iniquities calibrated for an unpleasant future. This invading dissonant realism slithers through the science fiction before culminating in the human characteristics of expediency. Such expediencies abhor fixation to a being-in-the-world built on deluded principles, where absurdity is not only a human condition but an affliction of the transhuman and posthuman condition as well. Only, the latter is not necessarily a peep into the future, which it might very well be, but rather a disturbing look into present-day topographies, which for Camus meant acquiescing to predicament, and for Richard Morgan a search for the culpable.


Politics of Teleonomies of Blockchain…Thought of the Day 155.0


All of this starts with the dictum, “There are no men at work”.

The notion of blockchain is that of a decentralized polity. A blockchain is immutable, for once written onto a block, a record is practically un-erasable. And most importantly, it is collateralizable, in that even in the absence of physical assets, digital ownership can be traded as collateral. So, once you have a blockchain, you create a stack that can be database-controlled using a Virtual Machine; think of it as a kind of digital twin. What, then, are the benefits of this decentralized digital polity? One crucial benefit is getting rid of intermediaries (unless one considers escrow accounts an invisible intermediary, though they seldom fulfil the definitional criteria). In short, digital twinning furthers social scalability by pushing intermediaries off into an invisible mode. Now, when blockchains are juxtaposed with algorithmically run machines (AI is just one branch of these), one gets the benefits of social scalability with analytics, the ever-increasing ocean of raw data hermeneutically sealed into information for utilitarian purposes. The advantages of a decentralized polity and social scalability combine into a truly democratic experience in open-sourced modeling, where netizens (since we are still mired in the controversy of net neutrality) experience participatory democracy.
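The immutability claimed above rests on hash-chaining: each block commits to the hash of its predecessor, so altering any earlier record breaks every later link. Below is a minimal sketch in plain Python (standard library only; the block fields and helper names are illustrative assumptions, not any particular chain’s format):

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 digest of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, record):
    """Append a new block that commits to the hash of the previous block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "record": record, "prev_hash": prev})

def verify(chain):
    """True only if every block still points at an unaltered predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
for record in ["asset A -> Alice", "asset A -> Bob"]:
    append_block(chain, record)

print(verify(chain))                       # True
chain[0]["record"] = "asset A -> Mallory"  # tamper with history
print(verify(chain))                       # False: the rewrite is detected
```

In a decentralized setting the same check is run by every node, which is what lets the intermediary recede into the invisible mode described above.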
How would these combine with the exigencies of scarce natural resources? It is here that such hackathons combine the ingenuity of blockchain with AI in a process generally referred to as “mining”. This launch from nature as we know it is Nature 2.0. To repeat, a decentralized polity and social scalability create a self-sustaining ecosystem in the sense of anti-fragility (yes, Taleb’s anti-fragile feeds back into this), with autonomously created machine-learning systems that are largely correctional in nature on the one hand and improve their learning capacities from the environment on the other. These two hands coordinate, giving rise to resource manipulation that lends a synthetic definition to materialities taken straight from physics textbooks, scared-to-apprehend materialities as thermodynamic quotients. And this is where AI steams up into a grand globalized alliance of machines embodying agencies always looking for cognitive enhancements to fulfil a teleonomic life, derived from the above-stated thermodynamic quotient of randomness and disorder, into gratifying sensibilities of self-sustenance. Synthetic biologists (of the Craig Venter and CRISPR-like lines) call this genetic programming, whereas singularitarians term it evolution, a break from the simulated evolution that defined the early days of AI. Synthetic life becomes the more capable of decision-making the more it is subjected to the whims and fancies of the surrounding environment via machine learning, leading to autonomous materialities with cognitive capabilities. These are parthenogenetic machines with unencumbered networking capacities. Such is the advent of self-ownership, and taking it to refer to nature as we have hitherto known it is a cathectic fallacy in ethics. Taking it to mean something different, in the sense of establishing a symbiotic relationship between biology and machines to yield biomachines with the characteristics of biomachination, replication (reproduction, with CC and CV thrown open for editing via genetic programming) and self-actualization, is what blockchain in composite with AI and Synthetic Biology makes Nature 2.0.
Yes, there are downsides to traditional manners of thought, man playing god with nature and so on; these are ethical constraints and thus political in undertone, but they come with conservative theoretics and are thus unable to come to terms with the politics of resource abundance that the machinic promulgates…

Algorithmic Trading. Thought of the Day 151.0

Figure: HFT order routing

One of the first algorithmic trading strategies consisted of using a volume-weighted average price, as the price at which orders would be executed. The VWAP introduced by Berkowitz et al. can be calculated as the dollar amount traded for every transaction (price times shares traded) divided by the total shares traded for a given period. If the price of a buy order is lower than the VWAP, the trade is executed; if the price is higher, then the trade is not executed. Participants wishing to lower the market impact of their trades stress the importance of market volume. Market volume impact can be measured through comparing the execution price of an order to a benchmark. The VWAP benchmark is the sum of every transaction price paid, weighted by its volume. VWAP strategies allow the order to dilute the impact of orders through the day. Most institutional trading occurs in filling orders that exceed the daily volume. When large numbers of shares must be traded, liquidity concerns can affect price goals. For this reason, some firms offer multiday VWAP strategies to respond to customers’ requests. In order to further reduce the market impact of large orders, customers can specify their own volume participation by limiting the volume of their orders to coincide with low expected volume days. Each order is sliced into several days’ orders and then sent to a VWAP engine for the corresponding days. VWAP strategies fall into three categories: sell order to a broker-dealer who guarantees VWAP; cross the order at a future date at VWAP; or trade the order with the goal of achieving a price of VWAP or better.
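A minimal sketch of the VWAP benchmark and the buy rule just described, assuming trades arrive as (price, shares) pairs for the period; the function names are illustrative, not taken from any trading library:

```python
def vwap(trades):
    """Volume-weighted average price: total dollars traded / total shares."""
    dollar_volume = sum(price * shares for price, shares in trades)
    total_shares = sum(shares for _, shares in trades)
    return dollar_volume / total_shares

def execute_buy(order_price, trades):
    """Per the rule above: execute only if the buy price is below the VWAP."""
    return order_price < vwap(trades)

trades = [(100.0, 500), (100.5, 300), (99.8, 200)]
print(round(vwap(trades), 2))       # 100.11, the benchmark for the period
print(execute_buy(100.0, trades))   # True: the order price sits below VWAP
```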

The second algorithmic trading strategy is the time-weighted average price (TWAP). TWAP allows traders to slice a trade over a certain period of time, thus an order can be cut into several equal parts and be traded throughout the time period specified by the order. TWAP is used for orders which are not dependent on volume. TWAP can overcome obstacles such as fulfilling orders in illiquid stocks with unpredictable volume. Conversely, high-volume traders can also use TWAP to execute their orders over a specific time by slicing the order into several parts so that the impact of the execution does not significantly distort the market.
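A minimal sketch of the TWAP slicing just described: the parent order is cut into near-equal child orders spread evenly over the specified window (the schedule format is an illustrative assumption; no venue connectivity is implied):

```python
from datetime import datetime

def twap_schedule(total_shares, start, end, n_slices):
    """Return (time, shares) child orders of near-equal size over [start, end)."""
    interval = (end - start) / n_slices
    base, remainder = divmod(total_shares, n_slices)
    return [(start + i * interval, base + (1 if i < remainder else 0))
            for i in range(n_slices)]

# 10,000 shares sliced into 13 half-hour child orders across the trading day
for t, qty in twap_schedule(10_000, datetime(2024, 1, 2, 9, 30),
                            datetime(2024, 1, 2, 16, 0), 13):
    print(t.time(), qty)
```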

Yet, another type of algorithmic trading strategy is the implementation shortfall or the arrival price. The implementation shortfall is defined as the difference in return between a theoretical portfolio and an implemented portfolio. When deciding to buy or sell stocks during portfolio construction, a portfolio manager looks at the prevailing prices (decision prices). However, several factors can cause execution prices to be different from decision prices. This results in returns that differ from the portfolio manager’s expectations. Implementation shortfall is measured as the difference between the dollar return of a paper portfolio (paper return) where all shares are assumed to transact at the prevailing market prices at the time of the investment decision and the actual dollar return of the portfolio (real portfolio return). The main advantage of the implementation shortfall-based algorithmic system is to manage transactions costs (most notably market impact and timing risk) over the specified trading horizon while adapting to changing market conditions and prices.
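A minimal sketch of the shortfall measure just described: the paper return assumes every share transacts at the decision price, while the real return uses the prices actually obtained (names and figures are illustrative):

```python
def implementation_shortfall(decision_price, fills, final_price, total_shares):
    """Dollar gap between the paper portfolio and the implemented portfolio.

    fills: list of (execution_price, shares) actually transacted.
    """
    paper_return = (final_price - decision_price) * total_shares
    filled_shares = sum(shares for _, shares in fills)
    cost = sum(price * shares for price, shares in fills)
    real_return = final_price * filled_shares - cost
    return paper_return - real_return

# 10,000 shares decided at 50.00, filled in two slices, marked at 50.40
print(implementation_shortfall(50.00, [(50.10, 6000), (50.25, 4000)], 50.40, 10_000))
# 1600.0: the cost of market impact and timing relative to the decision price
```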

The participation algorithm or volume participation algorithm is used to trade up to the order quantity using a rate of execution that is in proportion to the actual volume trading in the market. It is ideal for trading large orders in liquid instruments where controlling market impact is a priority. The participation algorithm is similar to the VWAP except that a trader can set the volume to a constant percentage of total volume of a given order. This algorithm can represent a method of minimizing supply and demand imbalances (Kendall Kim – Electronic and Algorithmic Trading Technology).
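A minimal sketch of the participation rule just described: each child order tracks a constant fraction of the volume actually observed in the market, capped by what remains of the parent order (the 10% rate and interval volumes are illustrative):

```python
def participation_slice(remaining, interval_market_volume, participation_rate):
    """Shares to send now so executions stay near the target participation rate."""
    target = int(interval_market_volume * participation_rate)
    return min(target, remaining)

remaining = 50_000
for market_volume in [120_000, 80_000, 200_000]:      # observed interval volumes
    qty = participation_slice(remaining, market_volume, 0.10)
    remaining -= qty
    print(qty, remaining)   # 12000/38000, then 8000/30000, then 20000/10000
```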

Smart order routing (SOR) algorithms allow a single order to exist simultaneously in multiple markets. They are critical for algorithmic execution models. It is highly desirable for algorithmic systems to have the ability to connect different markets in a manner that permits trades to flow quickly and efficiently from market to market. Smart routing algorithms provide full integration of information among all the participants in the different markets where the trades are routed. SOR algorithms allow traders to place large blocks of shares in the order book without fear of sending out a signal to other market participants. The algorithm matches limit orders and executes them at the midpoint of the bid-ask price quoted in different exchanges.
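A minimal sketch of the midpoint-crossing idea just described: given quotes from several venues, a marketable limit order is matched at the midpoint of the consolidated best bid and ask (venue names and the routing rule are illustrative assumptions, not any exchange’s actual interface):

```python
def midpoint_cross(limit_price, side, quotes):
    """quotes: {venue: (bid, ask)}. Return (venue, price) if the order can cross."""
    best_bid = max(bid for bid, _ in quotes.values())
    best_ask = min(ask for _, ask in quotes.values())
    mid = (best_bid + best_ask) / 2
    if side == "buy" and limit_price >= mid:
        return min(quotes, key=lambda v: quotes[v][1]), mid   # venue with best ask
    if side == "sell" and limit_price <= mid:
        return max(quotes, key=lambda v: quotes[v][0]), mid   # venue with best bid
    return None

quotes = {"NYSE": (99.5, 100.5), "NASDAQ": (99.75, 100.5), "BATS": (99.5, 100.25)}
print(midpoint_cross(100.00, "buy", quotes))   # ('BATS', 100.0)
```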

Handbook of Trading Strategies for Navigating and Profiting From Currency, Bond, Stock Markets

Bacteria’s Perception-Action Circle: Materiality of the Ontological. Thought of the Day 136.0


The unicellular organism has thin filaments protruding from its cell membrane, and in the absence of any stimuli it simply wanders randomly around by changing between two characteristic movement patterns. One is performed by rotating the flagella counterclockwise; in that case they form a bundle which pushes the cell forward along a curved path, a ‘run’ of random duration. These runs interchange with ‘tumbles’, where the flagella shift to clockwise rotation, making them work independently and hence moving the cell erratically around with small net displacement. The biased random walk now consists in the fact that, in the presence of a chemical attractant, the runs that happen to carry the cell closer to the attractant are extended, while runs in other directions are not. The sensing of the chemical attractant is performed temporally rather than spatially, because the cell moves too rapidly for concentration comparisons between its two ends to be possible. A chemical repellent in the environment gives rise to an analogous behavioral structure – now the biased random walk takes the cell away from the repellent. The bias saturates very quickly, which is what prevents the cell from continuing in a ‘false’ direction, because a higher concentration of attractant is then needed to repeat the bias. The reception system has three parts: one detecting repellents such as leucine, another detecting sugars, the third oxygen and oxygen-like substances.
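A minimal sketch of the biased random walk just described: the comparison is temporal (current versus previous concentration along the run), runs that improve the reading are extended, and all others end in a tumble to a random new heading. The toy attractant field and parameters are illustrative assumptions:

```python
import math
import random

def attractant(x, y):
    """Toy attractant concentration, increasing toward the origin."""
    return -math.hypot(x, y)

def run_and_tumble(steps=2000, step_len=1.0):
    x, y = 50.0, 50.0                      # start far from the source
    heading = random.uniform(0, 2 * math.pi)
    last_c = attractant(x, y)
    for _ in range(steps):
        x += step_len * math.cos(heading)  # a 'run' segment
        y += step_len * math.sin(heading)
        c = attractant(x, y)
        if c <= last_c:                    # no improvement: 'tumble' to re-orient
            heading = random.uniform(0, 2 * math.pi)
        last_c = c                         # temporal, not spatial, comparison
    return x, y

print(run_and_tumble())   # net displacement drifts toward the attractant source
```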

Figure: Uexküll’s model of the functional cycle

The cell’s behavior forms a primitive, if full-fledged, example of von Uexküll’s functional circle connecting specific perception signs and action signs. Functional-circle behavior is thus no privilege of animals equipped with central nervous systems (CNS). Both types of signs involve categorization. First, the sensory receptors of the bacterium are evidently organized according to a categorization of certain biologically significant chemicals, while most chemicals that remain insignificant for the cell’s metabolism and survival are ignored. The self-preservation of metabolism and cell structure is hence the ultimate regulator, supported by the perception-action cycles described. The categorization inherent in the very structure of the sensors is mirrored in the categorization of act types. Three act types are outlined: a null-action, composed of random running and tumbling, and two mirroring biased variants triggered by attractants and repellents, respectively. Moreover, a negative feedback loop governed by quick satiation grants that the window of concentration shifts to which the cell is able to react appropriately is large – it so to speak calibrates the sensory system so that it does not remain blinded by one perception and does not keep moving the cell forward in one selected direction. This adaptation grants that the system works across a large range of attractant/repellent concentrations. The simple signals at stake in the cell’s functional circle display an important property: at simple biological levels, the distinction between signs and perception vanishes – that distinction is presumably only relevant for higher, CNS-based animals. Here the signals are based on categorical perception, a perception which immediately categorizes the entity perceived and thus remains blind to internal differences within the category.


The mechanism by which the cell identifies sugar is partly identical to what goes on in human taste buds. Sensing a sugar gradient must, of course, differ from consuming the sugar: while the latter destroys the sugar molecule, the former merely reads an ‘active site’ on the outside of the macromolecule. E. coli – exactly like us – may be fooled by artificial sweeteners bearing the same ‘active site’ on their outer perimeter, even though they are completely different chemicals (this is, of course, the secret behind such sweeteners: they are not sugars and hence do not enter the digestive process carrying the energy of carbohydrates). This implies that E. coli may be fooled. Bacteria may not lie, but a simpler process than lying (which presupposes two agents and the ability to be fooled) is, in fact, being fooled (presupposing, in turn, only one agent and an ambiguous environment). E. coli has the ability to categorize a series of sugars – but, by the same token, the ability to categorize a series of irrelevant substances along with them. On the one hand, the ability to recognize and categorize an object by a surface property alone (due to the weak van der Waals bonds and hydrogen bonds at the ‘active site’, in contrast to the strong covalent bonds holding the molecule together) facilitates perceptual economy and quick adaptability of action. On the other hand, the economy of judging objects by their surface alone has an unavoidable flip side: it involves the possibility of mistake, of being fooled by admitting impostors into your categorization. So, from the perception-action circle of a bacterium, with its self-regulatory stability of a metabolism built on categorized signal-and-action engagement with the surroundings, a path leads through intercellular communication in multicellular organisms to the complicated perception and communication of higher animals.

Infinite Sequences and Halting Problem. Thought of the Day 76.0


In attempting to extend the notion of depth from finite strings to infinite sequences, one encounters a familiar phenomenon: the definitions become sharper (e.g. recursively invariant), but their intuitive meaning is less clear, because of distinctions (e.g. between infinitely-often and almost-everywhere properties) that do not exist in the finite case.

An infinite sequence X is called strongly deep if, at every significance level s and for every recursive function f, all but finitely many initial segments X_n have depth exceeding f(n).

It is necessary to require the initial segments to be deep almost everywhere rather than infinitely often, because even the most trivial sequence has infinitely many deep initial segments X_n (viz. the segments whose lengths n are deep numbers).

It is not difficult to show that the property of strong depth is invariant under truth-table equivalence (this is the same as Turing equivalence in recursively bounded time, or via a total recursive operator), and that the same notion would result if the initial segments were required to be deep in the sense of receiving less than 2^{-s} of their algorithmic probability from f(n)-fast programs. The characteristic sequence of the halting set K is an example of a strongly deep sequence.
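For compactness, the definition above can be restated symbolically; here depth_s(X_n) denotes the depth of the length-n initial segment X_n at significance level s, and ∀∞ n abbreviates “for all but finitely many n” (notation assumed for this restatement, not taken from the source):

```latex
\[
X \text{ is strongly deep}
\;\Longleftrightarrow\;
(\forall s)\,(\forall \text{ recursive } f)\,(\forall^{\infty} n)\;
\operatorname{depth}_{s}(X_n) > f(n).
\]
```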

A weaker definition of depth, also invariant under truth-table equivalence, is perhaps more analogous to that adopted for finite strings:

An infinite sequence X is weakly deep if it is not computable in recursively bounded time from any algorithmically random infinite sequence.

Computability in recursively bounded time is equivalent to two other properties, viz. truth-table reducibility and reducibility via a total recursive operator.

By contrast to the situation with truth-table reducibility, Péter Gács has shown that every sequence is computable from (i.e. Turing reducible to) an algorithmically random sequence if no bound is imposed on the time. This is the infinite analog of the far more obvious fact that every finite string is computable from an algorithmically random string (e.g. its minimal program).

Every strongly deep sequence is weakly deep, but by intermittently padding K with large blocks of zeros, one can construct a weakly deep sequence with infinitely many shallow initial segments.

Truth-table reducibility to an algorithmically random sequence is equivalent to the property, studied by Levin et al., of being random with respect to some recursive measure. Levin calls sequences with this property “proper” or “complete” sequences, and views them as more realistic and interesting than other sequences because they are the typical outcomes of probabilistic or deterministic effective processes operating in recursively bounded time.

Weakly deep sequences arise with finite probability when a universal Turing machine (with one-way input and output tapes, so that it can act as a transducer of infinite sequences) is given an infinite coin toss sequence for input. These sequences are necessarily produced very slowly: the time to output the n’th digit being bounded by no recursive function, and the output sequence contains evidence of this slowness. Because they are produced with finite probability, such sequences can contain only finite information about the halting problem.

Quantum Informational Biochemistry. Thought of the Day 71.0


A natural extension of the information-theoretic Darwinian approach to biological systems is obtained by taking into account that biological systems are constituted, at their fundamental level, by physical systems. It is therefore through the interaction among elementary physical systems that the biological level is reached, after the size of the system has increased by several orders of magnitude, and only for certain associations of molecules – biochemistry.

In particular, this viewpoint lies at the foundation of the “quantum brain” project established by Hameroff and Penrose (Shadows of the Mind). They tried to lift quantum physical processes associated with microsystems composing the brain to the level of consciousness. Microtubules were considered the basic quantum information processors. This project, as well as the general project of reducing biology to quantum physics, has its strong and weak sides. One of the main problems is that decoherence should quickly wash out quantum features such as superposition and entanglement. (Hameroff and Penrose would disagree with this statement; they try to develop models of a hot and macroscopic brain preserving the quantum features of its elementary micro-components.)

However, even if we assume that microscopic quantum physical behavior disappears with increasing size and number of atoms due to decoherence, it seems that the basic quantum features of information processing can survive in macroscopic biological systems (operating on temporal and spatial scales which are essentially different from the scales of the quantum micro-world). The associated information processor for a mesoscopic or macroscopic biological system would be a network of increasing complexity formed by the elementary probabilistic classical Turing machines of its constituents. Such a composite network of processors can exhibit special behavioral signatures which are similar to quantum ones. We call such biological systems quantum-like. In a series of works, Asano and others (Quantum Adaptivity in Biology: From Genetics to Cognition) developed an advanced formalism for modeling the behavior of quantum-like systems based on the theory of open quantum systems and the more general theory of adaptive quantum systems. This formalism is known as quantum bioinformatics.

The present quantum-like model of biological behavior is of the operational type (as well as the standard quantum mechanical model endowed with the Copenhagen interpretation). It cannot explain physical and biological processes behind the quantum-like information processing. Clarification of the origin of quantum-like biological behavior is related, in particular, to understanding of the nature of entanglement and its role in the process of interaction and cooperation in physical and biological systems. Qualitatively the information-theoretic Darwinian approach supplies an interesting possibility of explaining the generation of quantum-like information processors in biological systems. Hence, it can serve as the bio-physical background for quantum bioinformatics. There is an intriguing point in the fact that if the information-theoretic Darwinian approach is right, then it would be possible to produce quantum information from optimal flows of past, present and anticipated classical information in any classical information processor endowed with a complex enough program. Thus the unified evolutionary theory would supply a physical basis to Quantum Information Biology.

Suspicion on Consciousness as an Immanent Derivative


The category of the subject (like that of the object) has no place in an immanent world. There can be no transcendent, subjective essence. What, then, is the ontological status of a body and its attendant instance of consciousness? In what would it exist? Sanford Kwinter (conjuncted here) offers:

It would exist precisely in the ever-shifting pattern of mixtures or composites: both internal ones – the body as a site marked and traversed by forces that converge upon it in continuous variation; and external ones – the capacity of any individuated substance to combine and recombine with other bodies or elements (ensembles), both influencing their actions and undergoing influence by them. The ‘subject’ … is but a synthetic unit falling at the midpoint or interface of two more fundamental systems of articulation: the first composed of the fluctuating microscopic relations and mixtures of which the subject is made up, the second of the macro-blocs of relations or ensembles into which it enters. The image produced at the interface of these two systems – that which replaces, yet is too often mistaken for, subjective essence – may in turn have its own individuality characterized with a certain rigor. For each mixture at this level introduces into the bloc a certain number of defining capacities that determine both what the ‘subject’ is capable of bringing to pass outside of itself and what it is capable of receiving (undergoing) in terms of effects.

This description is sufficient to explain the immanent nature of the subjective bloc as something entirely embedded in and conditioned by its surroundings. What it does not offer – and what is not offered in any detail in the entirety of the work – is an in-depth account of what, exactly, these “defining capacities” are. To be sure, it would be unfair to demand a complete description of these capacities. Kwinter himself has elsewhere referred to the states of the nervous system as “magically complex”. Regardless of the specificity with which these capacities can presently be defined, we must nonetheless agree that it is at this interface, as he calls it, at this location where so many systems are densely overlaid, that consciousness is produced. We may be convinced that this consciousness, this apparent internal space of thought, is derived entirely from immanent conditions and can only be granted the ontological status of an effect, but this effect still manages to produce certain difficulties when attempting to define modes of behavior appropriate to an immanent world.

There is a palpable suspicion of the role of consciousness throughout Kwinter’s work, at least insofar as it is equated with some kind of internal, subjective space. (In one text he optimistically awaits the day when this space will “be left utterly in shreds.”) The basis of this suspicion is multiple and obvious. Among the capacities of consciousness is the ability to attribute to itself the (false) image of a stable and transcendent essence. The workings of consciousness are precisely what allow the subjective bloc to orient itself in a sequence of time, separating itself from an absolute experience of the moment. It is within consciousness that limiting and arbitrary moral categories seem to most stubbornly lodge themselves. (To be sure this is the location of all critical thought.) And, above all, consciousness may serve as the repository for conditioned behaviors which believe themselves to be free of external determination. Consciousness, in short, contains within itself an enormous number of limiting factors which would retard the production of novelty. Insofar as it appears to possess the capacity for self-determination, this capacity would seem most productively applied by turning on itself – that is, precisely by making the choice not to make conscious decisions and instead to permit oneself to be seized by extra-subjective forces.

Potential Synapses. Thought of the Day 52.0

For a neuron to recognize a pattern of activity it requires a set of co-located synapses (typically fifteen to twenty) that connect to a subset of the cells that are active in the pattern to be recognized. Learning to recognize a new pattern is accomplished by the formation of a set of new synapses collocated on a dendritic segment.


Figure: Learning by growing new synapses. Learning in an HTM neuron is modeled by the growth of new synapses from a set of potential synapses. A “permanence” value is assigned to each potential synapse and represents the growth of the synapse. Learning occurs by incrementing or decrementing permanence values. The synapse weight is a binary value set to 1 if the permanence is above a threshold.

Figure shows how we model the formation of new synapses in a simulated Hierarchical Temporal Memory (HTM) neuron. For each dendritic segment we maintain a set of “potential” synapses between the dendritic segment and other cells in the network that could potentially form a synapse with the segment. The number of potential synapses is larger than the number of actual synapses. We assign each potential synapse a scalar value called “permanence” which represents stages of growth of the synapse. A permanence value close to zero represents an axon and dendrite with the potential to form a synapse but that have not commenced growing one. A 1.0 permanence value represents an axon and dendrite with a large fully formed synapse.

The permanence value is incremented and decremented using a Hebbian-like rule. If the permanence value exceeds a threshold, such as 0.3, then the weight of the synapse is 1, if the permanence value is at or below the threshold then the weight of the synapse is 0. The threshold represents the establishment of a synapse, albeit one that could easily disappear. A synapse with a permanence value of 1.0 has the same effect as a synapse with a permanence value at threshold but is not as easily forgotten. Using a scalar permanence value enables on-line learning in the presence of noise. A previously unseen input pattern could be noise or it could be the start of a new trend that will repeat in the future. By growing new synapses, the network can start to learn a new pattern when it is first encountered, but only act differently after several presentations of the new pattern. Increasing permanence beyond the threshold means that patterns experienced more than others will take longer to forget.
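A minimal sketch of the permanence rule just described: each potential synapse on a segment carries a scalar permanence in [0, 1], a Hebbian-like update increments the permanences of active presynaptic cells and decrements the rest, and the binary weight is 1 only above the threshold (the increment, decrement and threshold values are illustrative assumptions, not HTM reference settings):

```python
import numpy as np

THRESHOLD = 0.3          # permanence at which a synapse counts as connected
P_INC, P_DEC = 0.05, 0.02

def learn(permanences, active_presynaptic):
    """Hebbian-like update of the potential synapses on one dendritic segment."""
    updated = np.where(active_presynaptic,
                       permanences + P_INC,    # reinforce co-active inputs
                       permanences - P_DEC)    # decay the rest
    return np.clip(updated, 0.0, 1.0)

def weights(permanences):
    """Binary synapse weights: 1 above threshold, 0 at or below it."""
    return (permanences > THRESHOLD).astype(int)

perm = np.array([0.05, 0.28, 0.31, 0.90])
active = np.array([True, True, False, False])
perm = learn(perm, active)
print(perm)            # approx [0.10 0.33 0.29 0.88]
print(weights(perm))   # [0 1 0 1]: one synapse formed, one temporarily lost
```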

HTM neurons and HTM networks rely on distributed patterns of cell activity, thus the activation strength of any one neuron or synapse is not very important. Therefore, in HTM simulations we model neuron activations and synapse weights with binary states. Additionally, it is well known that biological synapses are stochastic, so a neocortical theory cannot require precision of synaptic efficacy. Although scalar states and weights might improve performance, they are not required from a theoretical point of view.

US Stock Market Interaction Network as Learned by the Boltzmann Machine


Price formation on a financial market is a complex problem: it reflects the opinion of investors about the true value of the asset in question, the policies of producers, external regulation and many other factors. Given the large number of factors influencing price, many of which are unknown to us, describing price formation essentially requires probabilistic approaches. In the last decades, the synergy of methods from various scientific areas has opened new horizons in understanding the mechanisms that underlie related problems. One popular approach is to consider a financial market as a complex system, where not only does a great number of constituents play a crucial role, but so do the non-trivial interactions between them. For example, related interdisciplinary studies of complex financial systems have revealed their enhanced sensitivity to fluctuations and external factors near critical events, with an overall change of internal structure. This can be complemented by research devoted to equilibrium and non-equilibrium phase transitions.

In general, statistical modeling of the state space of a complex system requires writing down the probability distribution over this space using real data. In a simple version of modeling, the probability of an observable configuration (state of a system) described by a vector of variables s can be given in the exponential form

p(s) = Z^{-1} exp{−βH(s)} —– (1)

where H is the Hamiltonian of the system, β is the inverse temperature (β ≡ 1 is assumed in what follows) and Z is the statistical sum. The physical meaning of the model’s components depends on the context; for instance, in the case of financial systems, s can represent a vector of stock returns and H can be interpreted as the inverse utility function. Generally, H has parameters defined by its series expansion in s. Based on the maximum entropy principle, an expansion up to the quadratic terms is usually used, leading to pairwise interaction models. In the equilibrium case, the Hamiltonian has the form

H(s) = −h^T s − s^T J s —– (2)

where h is a vector of size N of external fields and J is a symmetric N × N matrix of couplings (T denotes transpose). The energy-based models represented by (1) play an essential role not only in statistical physics but also in neuroscience (models of neural networks) and machine learning (generative models, also known as Boltzmann machines). Given the topological similarities between neural and financial networks, these systems can be considered examples of complex adaptive systems, which are characterized by the ability to adapt to a changing environment while trying to stay in equilibrium with it. From this point of view, market structural properties, e.g. clustering and networks, play an important role in modeling the distribution of stock prices. Adaptation (or learning) in these systems implies a change of the parameters of H as financial and economic systems evolve. Using statistical inference for the model’s parameters, the main goal is to have a model capable of reproducing the same statistical observables, given time series for a particular historical period. In the pairwise case, the objective is to have

⟨s_i⟩_data = ⟨s_i⟩_model —– (3a)

⟨s_i s_j⟩_data = ⟨s_i s_j⟩_model —– (3b)

where angular brackets denote statistical averaging over time. Having specified the general mathematical model, one can also discuss similarities between financial and infinite-range magnetic systems in terms of related phenomena, e.g. extensivity, order parameters, phase transitions, etc. These features can be captured even in the simplified case when s_i is a binary variable taking only two discrete values. Consider the effect of mapping to a binarized system in which the values s_i = +1 and s_i = −1 correspond to profit and loss respectively. In this case, the diagonal elements of the coupling matrix, J_ii, are zero, because the s_i^2 = 1 terms do not contribute to the Hamiltonian….
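A minimal sketch of the binarized pairwise setup above: synthetic returns are mapped to s_i = ±1, the data moments of (3a) and (3b) are estimated from the time series, and the couplings are approximated with the naive mean-field inversion J ≈ −C⁻¹ (off-diagonal), a standard inverse-Ising shortcut rather than the exact Boltzmann-machine maximum-likelihood fit; all data and parameters here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(size=(250, 5))        # 250 days, 5 stocks (synthetic data)
s = np.where(returns >= 0, 1, -1)          # binarize: +1 profit, -1 loss

m = s.mean(axis=0)                         # <s_i>_data, eq. (3a)
corr = (s.T @ s) / s.shape[0]              # <s_i s_j>_data, eq. (3b)
C = corr - np.outer(m, m)                  # connected correlations

J = -np.linalg.inv(C)                      # naive mean-field coupling estimate
np.fill_diagonal(J, 0.0)                   # J_ii = 0: s_i^2 = 1 adds nothing
h = np.arctanh(m) - J @ m                  # mean-field external fields

def energy(state, h, J):
    """Pairwise Hamiltonian H(s) = -h^T s - s^T J s of eq. (2)."""
    return -h @ state - state @ J @ state

print(energy(s[0], h, J))                  # energy of the first day's configuration
```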

US stock market interaction network as learned by the Boltzmann Machine

Algorithmic Randomness and Complexity


How random is a real? Given two reals, which is more random? How should we even try to quantify these questions, and how do various choices of measurement relate? Once we have reasonable measuring devices and, using these devices, divide the reals into equivalence classes of the same “degree of randomness”, what do the resulting structures look like? Once we measure the level of randomness, how does it relate to classical measures of complexity, such as Turing degrees of unsolvability? Should high levels of randomness mean high levels of complexity in terms of computational power, or low levels? Conversely, should the structures of computability, such as the degrees and the computably enumerable sets, have anything to say about randomness for reals?

Algorithmic Randomness and Complexity