Quantum Informational Biochemistry. Thought of the Day 71.0

A natural extension of the information-theoretic Darwinian approach to biological systems is obtained by taking into account that biological systems are constituted, at their fundamental level, by physical systems. It is therefore through the interaction of elementary physical systems that the biological level is reached, after the size of the system has increased by several orders of magnitude, and only for certain associations of molecules – biochemistry.

In particular, this viewpoint lies at the foundation of the “quantum brain” project established by Hameroff and Penrose (Shadows of the Mind). They tried to lift quantum physical processes associated with microsystems composing the brain to the level of consciousness, with microtubules considered as the basic quantum information processors. This project, as well as the general project of reducing biology to quantum physics, has its strong and weak sides. One of the main problems is that decoherence should quickly wash out quantum features such as superposition and entanglement. (Hameroff and Penrose would disagree with this statement; they try to develop models of a hot and macroscopic brain that preserves the quantum features of its elementary micro-components.)

However, even if we assume that microscopic quantum physical behavior disappears with increasing size and number of atoms due to decoherence, it seems that the basic quantum features of information processing can survive in macroscopic biological systems (operating on temporal and spatial scales which are essentially different from the scales of the quantum micro-world). The associated information processor for a mesoscopic or macroscopic biological system would be a network of increasing complexity formed by the elementary probabilistic classical Turing machines of its constituents. Such a composite network of processors can exhibit special behavioral signatures which are similar to quantum ones. We call such biological systems quantum-like. In a series of works, Asano and others (Quantum Adaptivity in Biology: From Genetics to Cognition) developed an advanced formalism for modeling the behavior of quantum-like systems, based on the theory of open quantum systems and the more general theory of adaptive quantum systems. This formalism is known as quantum bioinformatics.

The present quantum-like model of biological behavior is of the operational type (as is the standard quantum mechanical model endowed with the Copenhagen interpretation). It cannot explain the physical and biological processes behind quantum-like information processing. Clarification of the origin of quantum-like biological behavior is related, in particular, to understanding the nature of entanglement and its role in the process of interaction and cooperation in physical and biological systems. Qualitatively, the information-theoretic Darwinian approach supplies an interesting possibility of explaining the generation of quantum-like information processors in biological systems. Hence, it can serve as the bio-physical background for quantum bioinformatics. There is an intriguing point here: if the information-theoretic Darwinian approach is right, then it would be possible to produce quantum information from optimal flows of past, present and anticipated classical information in any classical information processor endowed with a sufficiently complex program. Thus the unified evolutionary theory would supply a physical basis to Quantum Information Biology.

Fundamental Theorem of Asset Pricing: Tautological Meeting of Mathematical Martingale and Financial Arbitrage by the Measure of Probability.

The Fundamental Theorem of Asset Pricing (FTAP hereafter) has two broad tenets, viz.

1. A market admits no arbitrage, if and only if, the market has a martingale measure.

2. Every contingent claim can be hedged, if and only if, the martingale measure is unique.

The FTAP is a theorem of mathematics, and the use of the term ‘measure’ in its statement places the FTAP within the theory of probability formulated by Andrei Kolmogorov (Foundations of the Theory of Probability) in 1933. Kolmogorov’s work took place in a context captured by Bertrand Russell, who observed that

It is important to realise the fundamental position of probability in science. . . . As to what is meant by probability, opinions differ.

In the 1920s the idea of randomness, as distinct from a lack of information, was becoming substantive in the physical sciences because of the emergence of the Copenhagen Interpretation of quantum mechanics. In the social sciences, Frank Knight argued that uncertainty was the only source of profit, and the concept was pervading John Maynard Keynes’ economics (Robert Skidelsky, Keynes: The Return of the Master).

Two mathematical theories of probability had become ascendant by the late 1920s. Richard von Mises (brother of the Austrian economist Ludwig) attempted to lay down the axioms of classical probability within a framework of Empiricism, the ‘frequentist’ or ‘objective’ approach. To counterbalance von Mises, the Italian actuary Bruno de Finetti presented a more Pragmatic approach, characterised by his claim that “Probability does not exist” because it was only an expression of the observer’s view of the world. This ‘subjectivist’ approach was closely related to the less well-known position taken by the Pragmatist Frank Ramsey, who developed an argument against Keynes’ Realist interpretation of probability presented in the Treatise on Probability.

Kolmogorov addressed the trichotomy of mathematical probability by generalising so that Realist, Empiricist and Pragmatist probabilities were all examples of ‘measures’ satisfying certain axioms. In doing this, a random variable became a function while an expectation became an integral: probability became a branch of Analysis, not Statistics. Von Mises criticised Kolmogorov’s generalised framework as unnecessarily complex. About a decade and a half back, the physicist Edwin Jaynes (Probability Theory: The Logic of Science) championed Leonard Savage’s subjectivist Bayesianism as having a “deeper conceptual foundation which allows it to be extended to a wider class of applications, required by current problems of science”.

The objections of empirical scientists to measure-theoretic probability can be accounted for by its lack of physicality. Frequentist probability is based on the act of counting; subjectivist probability is based on a flow of information, which, following Claude Shannon, is now an observable entity in Empirical science. Measure-theoretic probability is based on abstract mathematical objects unrelated to sensible phenomena. However, the generality of Kolmogorov’s approach made it flexible enough to handle problems that emerged in physics and engineering during the Second World War, and his approach became widely accepted after 1950 because it was practically more useful.

In the context of the first statement of the FTAP, a ‘martingale measure’ is a probability measure, usually labelled Q, such that the (real, rather than nominal) price of an asset today, $X_0$, is the expectation, using the martingale measure, of its (real) price in the future, $X_T$. Formally,

$$X_0 = \mathbb{E}_Q[X_T]$$

The abstract probability measure Q is defined so that this equality holds; it is not derived from any empirical information about historical prices or from subjective judgement of future prices. The only condition placed on the relationship between the martingale measure and the ‘natural’, or ‘physical’, probability measure, usually assigned the label P, is that they agree on what is possible.
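As a toy sketch of this definition (my own illustration, with invented numbers), Q is simply whatever weighting of the possible future prices reproduces today’s price as an expectation, while P may weight the very same possibilities differently:

```python
# Toy illustration: a martingale measure Q versus the 'natural' measure P.
# Three possible future prices for one asset; the states are shared,
# only the probabilities attached to them differ.
future_prices = [80.0, 100.0, 130.0]   # possible values of X_T (hypothetical)
P = [0.5, 0.3, 0.2]                    # 'physical' measure, e.g. estimated from data
Q = [0.4, 0.3, 0.3]                    # martingale measure, defined so that X_0 = E_Q[X_T]

X0 = sum(q * x for q, x in zip(Q, future_prices))   # price today, by definition of Q
EP = sum(p * x for p, x in zip(P, future_prices))   # expectation under P, generally != X_0

# The only required link between P and Q: they agree on what is possible,
# i.e. they give positive probability to exactly the same states.
equivalent = all((p > 0) == (q > 0) for p, q in zip(P, Q))

print(f"X_0 = E_Q[X_T] = {X0:.2f}")   # 101.00
print(f"E_P[X_T]      = {EP:.2f}")   # 96.00
print(f"P and Q agree on what is possible: {equivalent}")
```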

The term ‘martingale’ in this context derives from doubling strategies in gambling, and it was introduced into mathematics by Jean Ville in a development of von Mises’ work. The idea that asset prices have the martingale property was first proposed by Benoit Mandelbrot in response to an early formulation of Eugene Fama’s Efficient Market Hypothesis (EMH), the two concepts being combined by Fama. For Mandelbrot and Fama the key consequence of prices being martingales was that the current price was independent of the future price, and technical analysis would not prove profitable in the long run. In developing the EMH there was no discussion of the nature of the probability under which assets are martingales, and it is often assumed that the expectation is calculated under the natural measure. While the FTAP employs modern terminology in the context of value-neutrality, the idea of equating a current price with a future, uncertain, price has ethical ramifications.

The other technical term in the first statement of the FTAP, arbitrage, has long been used in financial mathematics. Fibonacci’s Liber Abaci (Laurence Sigler, Fibonacci’s Liber Abaci) discusses ‘Barter of Merchandise and Similar Things’: 20 arms of cloth are worth 3 Pisan pounds and 42 rolls of cotton are similarly worth 5 Pisan pounds; it is sought how many rolls of cotton will be had for 50 arms of cloth. In this case there are three commodities, arms of cloth, rolls of cotton and Pisan pounds, and Fibonacci solves the problem by having Pisan pounds ‘arbitrate’, or ‘mediate’ as Aristotle might say, between the other two commodities.
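The arithmetic makes the mediation explicit (a straightforward reworking of Fibonacci’s numbers): converting cloth into Pisan pounds, and pounds into cotton,

$$50 \text{ arms} \times \frac{3 \text{ pounds}}{20 \text{ arms}} = 7.5 \text{ pounds}, \qquad 7.5 \text{ pounds} \times \frac{42 \text{ rolls}}{5 \text{ pounds}} = 63 \text{ rolls of cotton},$$

so 50 arms of cloth exchange for 63 rolls of cotton, the Pisan pounds dropping out of the final answer.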

Within neo-classical economics, the Law of One Price was developed in a series of papers between 1954 and 1964 by Kenneth Arrow, Gérard Debreu and Lionel McKenzie in the context of general equilibrium, in particular through the introduction of the Arrow Security, which, employing the Law of One Price, could be used to price any asset. It was on this principle that Black and Scholes believed the value of the warrants could be deduced by employing a hedging portfolio. In introducing their work with the statement that “it should not be possible to make sure profits”, they were invoking the arbitrage argument, which had an eight-hundred-year history. In the context of the FTAP, ‘an arbitrage’ has developed into the ability to formulate a trading strategy such that the probability, under a natural or martingale measure, of a loss is zero, but the probability of a positive profit is not.

To understand the connection between the financial concept of arbitrage and the mathematical idea of a martingale measure, consider the most basic case of a single asset whose current price is $X_0$ and whose price at time $T > 0$ in the future can take on one of two (present) values, $X_T^D < X_T^U$. In this case an arbitrage would exist if $X_0 \le X_T^D < X_T^U$: buying the asset now, at a price that is less than or equal to the future pay-offs, would lead to a possible profit at the end of the period, with the guarantee of no loss. Similarly, if $X_T^D < X_T^U \le X_0$, short selling the asset now and buying it back at time $T$ would also lead to an arbitrage. So, for there to be no arbitrage opportunities we require that

$$X_T^D < X_0 < X_T^U$$

This implies that there is a number $q$, with $0 < q < 1$, such that

$$X_0 = X_T^D + q\,(X_T^U - X_T^D) = q\,X_T^U + (1-q)\,X_T^D$$

The price now, $X_0$, lies between the future prices, $X_T^U$ and $X_T^D$, in the ratio $q : (1-q)$ and represents some sort of ‘average’. The first statement of the FTAP can be interpreted simply as “the price of an asset must lie between its maximum and minimum possible (real) future price”.

If $X_0 < X_T^D \le X_T^U$ we have that $q < 0$, whereas if $X_T^D \le X_T^U < X_0$ then $q > 1$; in both cases $q$ does not represent a probability, which, by Kolmogorov’s axioms, must lie between 0 and 1. In either of these cases an arbitrage exists and a trader can make a riskless profit: the market involves ‘turpe lucrum’. This account gives an insight as to why James Bernoulli, in his moral approach to probability, considered situations where probabilities did not sum to 1: he was considering problems that were pathological not because they failed the rules of arithmetic but because they were unfair. It follows that if there are no arbitrage opportunities then the quantity $q$ can be seen as representing the ‘probability’ that the $X_T^U$ price will materialise in the future. Formally

$$X_0 = q\,X_T^U + (1-q)\,X_T^D \equiv \mathbb{E}_Q[X_T]$$
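The whole two-state argument condenses into a few lines of code (an illustrative sketch with hypothetical prices, not part of the original argument): recover q from the three prices, flag the cases where it fails to be a probability, and check the martingale identity.

```python
def risk_neutral_probability(X0: float, XTD: float, XTU: float) -> float:
    """Return q such that X0 = q*XTU + (1 - q)*XTD, assuming XTD < XTU."""
    return (X0 - XTD) / (XTU - XTD)

def classify(X0: float, XTD: float, XTU: float) -> str:
    q = risk_neutral_probability(X0, XTD, XTU)
    if 0 < q < 1:
        # q behaves as a probability: X0 is an 'average' of the future prices.
        assert abs(q * XTU + (1 - q) * XTD - X0) < 1e-12   # martingale identity
        return f"no arbitrage, q = {q:.3f}"
    if q <= 0:
        return f"arbitrage (q = {q:.3f} <= 0): buy now, since X0 <= XTD"
    return f"arbitrage (q = {q:.3f} >= 1): short sell now, since X0 >= XTU"

# Hypothetical prices, purely for illustration.
print(classify(100.0, 90.0, 120.0))   # no arbitrage, q = 0.333
print(classify(85.0, 90.0, 120.0))    # buying now guarantees no loss, possible profit
print(classify(125.0, 90.0, 120.0))   # short selling now guarantees no loss, possible profit
```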

The connection between the financial concept of arbitrage and the mathematical object of a martingale is essentially a tautology: both statements mean that the price today of an asset must lie between its future minimum and maximum possible value. This first statement of the FTAP was anticipated by Frank Ramsey when he defined ‘probability’ in the Pragmatic sense of ‘a degree of belief’ and argued that ‘degrees of belief’ are measured through betting odds. On this basis he formulated some axioms of probability, including that a probability must lie between 0 and 1. He then went on to say that

These are the laws of probability, …If anyone’s mental condition violated these laws, his choice would depend on the precise form in which the options were offered him, which would be absurd. He could have a book made against him by a cunning better and would then stand to lose in any event.

This is a Pragmatic argument that identifies the absence of the martingale measure with the existence of arbitrage, and today this forms the basis of the standard argument as to why arbitrages do not exist: if they did, other market participants would bankrupt the agent who was mis-pricing the asset. This has become known in philosophy as the ‘Dutch Book’ argument and, as a consequence of the fact/value dichotomy, it is often presented as a ‘matter of fact’. However, ignoring the fact/value dichotomy, the Dutch Book argument is a variant of the ‘Golden Rule’ – “Do to others as you would have them do to you.” – it is infused with the moral concepts of fairness and reciprocity (Jeffrey Wattles, The Golden Rule).

Underlying the FTAP, then, is the ethical concept of Justice, capturing the social norms of reciprocity and fairness. This is significant in the context of Granovetter’s discussion of embeddedness in economics. It is conventional to assume that mainstream economic theory is ‘undersocialised’: agents are rational calculators seeking to maximise an objective function. The argument presented here is that a central theorem in contemporary economics, the FTAP, is deeply embedded in social norms, despite being presented as an undersocialised mathematical object. This embeddedness is a consequence of the origins of mathematical probability lying in the ethical analysis of commercial contracts: the feudal shackles are still binding this most modern of economic theories.

Ramsey goes on to make an important point

Having any definite degree of belief implies a certain measure of consistency, namely willingness to bet on a given proposition at the same odds for any stake, the stakes being measured in terms of ultimate values. Having degrees of belief obeying the laws of probability implies a further measure of consistency, namely such a consistency between the odds acceptable on different propositions as shall prevent a book being made against you.

Ramsey is arguing that an agent needs to employ the same measure in pricing all assets in a market, and this is the key result in contemporary derivative pricing. Having identified the martingale measure on the basis of a ‘primal’ asset, it is then applied across the market, in particular to derivatives written on the primal asset, with the well-known result that if two assets offer different ‘market prices of risk’, an arbitrage exists. This explains why the market price of risk appears in the Radon-Nikodym derivative and the Capital Market Line: it enforces Ramsey’s consistency in pricing.

The second statement of the FTAP is concerned with incomplete markets, which appear in relation to Arrow-Debreu prices. In mathematics, in the special case that there are as many, or more, assets in a market as there are possible future, uncertain, states, a unique pricing vector can be deduced for the market because of Cramer’s Rule. If the elements of the pricing vector satisfy the axioms of probability, specifically each element is positive and they all sum to one, then the market precludes arbitrage opportunities. This is the case covered by the first statement of the FTAP. In the more realistic situation that there are more possible future states than assets, the market can still be arbitrage free, but the pricing vector, the martingale measure, might not be unique. The agent can still be consistent in selecting which particular martingale measure they choose to use, but another agent might choose a different measure, such that the two do not agree on a price. In the context of the Law of One Price, this means that we cannot hedge, replicate or cover a position in the market such that the portfolio is riskless. The significance of the second statement of the FTAP is that it tells us that in the sensible world of imperfect knowledge and transaction costs, a model within the framework of the FTAP cannot give a precise price. When faced with incompleteness in markets, agents need alternative ways to price assets, and behavioural techniques have come to dominate financial theory. This feature was realised in The Port Royal Logic when it recognised the role of transaction costs in lotteries.
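The role of Cramer’s Rule and the effect of incompleteness can be sketched numerically (the assets and payoffs below are invented for illustration, using numpy): with as many independent assets as states the state-price vector is pinned down uniquely, while with more states than assets it is underdetermined, so several martingale measures are consistent with the same observed prices.

```python
import numpy as np

# Complete case: two assets (a bond and a stock), two future states.
# Row i of A holds asset i's payoff in each state; 'prices' holds today's prices.
A = np.array([[1.0, 1.0],       # bond pays 1 in both states
              [120.0, 90.0]])   # stock pays 120 (up) or 90 (down)
prices = np.array([1.0, 100.0])

state_prices = np.linalg.solve(A, prices)   # unique: A is square and invertible
q = state_prices / state_prices.sum()       # normalise into a probability-like vector
print("unique state prices:", state_prices, "-> martingale measure:", q)

# Incomplete case: the same two assets, but three future states.
A3 = np.array([[1.0, 1.0, 1.0],
               [120.0, 100.0, 90.0]])
# lstsq returns only one of infinitely many consistent solutions; the null space of A3
# parametrises the rest, so different agents may hold different, equally consistent,
# martingale measures and hence disagree on the price of an unhedgeable claim.
one_solution, *_ = np.linalg.lstsq(A3, prices, rcond=None)
print("one of many state-price vectors:", one_solution)
```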

Feynman Path Integrals, Trajectories and Copenhagen Interpretation. Note Quote.

As the trajectory exists by precept in the trajectory representation, there is no need for Copenhagen’s collapse of the wave function. The trajectory representation can describe an individual particle. On the other hand, Copenhagen describes an ensemble of particles while only rendering probabilities for individual particles.

The trajectory representation renders microstates of the Schrödinger wave function for the bound-state problem. Each microstate, given by the equation

$$\psi \;=\; \frac{(2m)^{1/4}\cos(W/\hbar)}{(W')^{1/2}} \;=\; \frac{(a\varphi^{2}+b\theta^{2}+c\varphi\theta)^{1/2}}{[\,a-c^{2}/(4b)\,]^{1/2}}\,\cos\!\left[\arctan\!\left(\frac{b\,(\theta/\varphi)+c/2}{(ab-c^{2}/4)^{1/2}}\right)\right] \;=\; \varphi$$

is sufficient by itself to determine the Schrödinger wave function. Thus, the existence of microstates is a counterexample refuting the Copenhagen assertion that the Schrödinger wave function is an exhaustive description of non-relativistic quantum phenomena. The trajectory representation is deterministic. We can now identify a trajectory and its corresponding Schrödinger wave function with sub-barrier energy that tunnels through the barrier with certainty. Hence, tunneling with certainty is a counterexample refuting Born’s postulate of the Copenhagen interpretation, which attributes a probability amplitude to the Schrödinger wave function. As the trajectory representation is deterministic and does not need ψ, much less an assignment of a probability amplitude to it, the trajectory representation does not need a wave packet to describe or localize a particle. The equation of motion is

$$t - \tau = \frac{\partial W}{\partial E},$$

where $t$ is the trajectory time, relative to its constant coordinate $\tau$, given as a function of $x$.

This equation of motion for a particle (monochromatic wave) has been shown to be consistent with the group velocity of the wave packet. Normalization, as previously noted herein, is determined by the nonlinearity of the generalized Hamilton-Jacobi equation for the trajectory representation, and for the Copenhagen interpretation by the probability of finding the particle in space being unity. Though probability is not needed for tunneling through a barrier, the trajectory interpretation of tunneling is still consistent with the Schrödinger representation without the Copenhagen interpretation. The incident wave with compound spatial modulation of amplitude and phase in the trajectory representation has only two spectral components, which are the incident and reflected unmodulated waves of the Schrödinger representation.
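As a minimal worked check of this equation of motion (my own example, not from the original, assuming the free-particle characteristic function $W = (2mE)^{1/2}x$, for which the quantum correction terms vanish):

$$t - \tau = \frac{\partial W}{\partial E} = \frac{\partial}{\partial E}\left[(2mE)^{1/2}x\right] = \left(\frac{m}{2E}\right)^{1/2}x = \frac{x}{v},$$

so that $x = v\,(t-\tau)$ with $v = (2E/m)^{1/2}$, which is exactly the motion obtained from the group velocity $v_g = \partial\omega/\partial k = \hbar k/m$ of the corresponding wave packet, since $\hbar k = (2mE)^{1/2}$.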

Trajectories differ from Feynman’s path integrals in three ways. First, trajectories employ a quantum Hamilton’s characteristic function, while a path integral is based upon a classical Hamilton’s characteristic function. Second, the quantum Hamilton’s characteristic function is determined uniquely by the initial values of the quantum stationary Hamilton-Jacobi equation, while path integrals are democratic, summing over all possible classical paths to determine Feynman’s amplitude. While path integrals need an infinite number of constants of the motion even for a single particle in one dimension, motion in the trajectory representation for a finite number of particles in finite dimensions is always determined by only a finite number of constants of the motion. Third, trajectories are well defined in classically forbidden regions, where path integrals are not defined by precept.
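For comparison, the ‘democratic sum over all paths’ refers to the standard textbook form of the path-integral amplitude (quoted here for orientation, not from the original):

$$K(x_b,t_b;x_a,t_a) = \int \mathcal{D}[x(t)]\,\exp\!\left(\frac{i}{\hbar}S[x(t)]\right), \qquad S[x(t)] = \int_{t_a}^{t_b} L(x,\dot{x},t)\,dt,$$

in which every path from $(x_a,t_a)$ to $(x_b,t_b)$ contributes with equal weight and a phase set by the classical action $S$, in contrast with the single quantum Hamilton’s characteristic function $W$ of the trajectory representation.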

Heisenberg’s uncertainty principle shall remain premature as long as Copenhagen uses an insufficient subset of initial conditions (x, p) to describe quantum phenomena. Bohr’s complementarity postulates that the wave-particle duality be resolved consistent with the measuring instrument’s specific properties. Anonymous referees of the Copenhagen school have had reservations concerning the representation of the incident modulated wave before the barrier. They have reported that the compoundly modulated wave is only a clever superposition of the incident and reflected unmodulated plane waves. They have concluded that synthesizing a running wave with compound spatial modulation from its spectral components is nonphysical because it would spontaneously split. By the superposition principle of linear differential equations, the spectral components may be used to synthesize a new pair of independent solutions with compound modulations running in opposite directions. Likewise, an unmodulated plane wave running in one direction can be synthesized from two waves with compound modulation running in opposite directions, for mappings under the superposition principle are reversible.
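A concrete instance of this reversibility (my own illustration, assuming real amplitudes $A$ and $B$ with $A + B \neq 0$): superposing incident and reflected unmodulated plane waves yields a single running wave with compound spatial modulation of amplitude and phase,

$$A e^{ikx} + B e^{-ikx} = R(x)\,e^{i\Phi(x)}, \qquad R(x) = \left(A^{2} + B^{2} + 2AB\cos 2kx\right)^{1/2}, \qquad \tan\Phi(x) = \frac{A-B}{A+B}\,\tan kx,$$

and reading the identity from right to left synthesizes the unmodulated spectral components back out of the modulated wave.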

Representation as a Meaningful Philosophical Quandary

The deliberation on representation indeed becomes a meaningful quandary, if most of the shortcomings are to be overcome, without actually accepting the way they permeate the scientific and philosophical discourse. The problem is more ideological than one could have imagined, since it is only within the space of this quandary that one can assume success in overthrowing the quandary. Unless the classical theory of representation that guides the expert systems has been accepted as existing, there is no way to dislodge the relationship of symbols and meanings that build up such systems, lest the predicament of falling prey to the Scylla of a metaphysically strong notion of meaningful representation as natural, or the Charybdis of an external designer, should gobble us up. If one somehow escapes these maliciously aporetic entities, representation as a metaphysical monster stands to block our progress. Is it really viable, then, to think of machines that can survive this representational foe, a foe that gets no aid from the clusters of internal mechanisms? The answer is very much in the affirmative, provided a consideration of such a non-representational system as continuous and homogeneous is done away with, and in its place are functional units that are no longer representational ones, for such units derive their efficiency and legitimacy through autopoiesis. What is required is to consider this notional representational critique of distributed systems in relation to the objectivity of science, since objectivity as a property of science has an intrinsic value of independence from the subject who studies the discipline.

Kuhn had some philosophical problems with this precise way of treating science as an objective discipline. For Kuhn, scientists operate under, or within, paradigms, thus obligating hierarchical structures. Such hierarchical structures ensure the position of scientists to voice their authority on matters of dispute, and when there is a crisis within, or for, the paradigm, scientists do not, to begin with, outrightly reject the paradigm, but try their level best at resolving it. In cases where resolution becomes a difficult task, an outright rejection of the paradigm would follow suit, thus effecting what is commonly called the paradigm shift. If such were the case, obviously, the objective tag for science takes a hit, and Kuhn argues in favor of a shift in the social order that science undergoes, signifying the subjective element. Importantly, these paradigm shifts occur to benefit scientific progress and, in almost all cases, occur non-linearly. Such a view no doubt slides Kuhn into a position of relativism, and has been the main point of attack on paradigm shifts. At the forefront of the attacks has been Michael Polanyi and his supporters, whose work on the epistemology of science has much the same ingredients but was eventually deprived of fame. Kuhn was charged with plagiarism. The commonality of their arguments could be measured by a dissenting voice against objectivity in science. Polanyi thought of it as a false ideal, since for him the epistemological claims that defined science were based more on personal judgments, and therefore susceptible to fallibilism. The objective nature of science that obligates the scientists to see things as they really are is kind of dislodged by the above principle of subjectivity.
But if science were to be seen as objective, then human subjectivity would indeed create a rupture as far as the purified version of scientific objectivity is sought. The subject, or observer, undergoes what is termed the “observer effect”, which refers to the changes that the act of observation makes upon the phenomenon being observed. This effect is close to ubiquitous across the domains of science and technology, ranging from the Heisenbug(1) in computing, via particle physics and the science of thermodynamics, to quantum mechanics. The quantum-mechanical observer effect is quite perplexing, and is a result of a phenomenon called “superposition”, which signifies existence in all possible states all at once. Superposition gets its credit due to Schrödinger’s cat experiment. The experiment entails a cat that is neither dead nor alive until observed. This has led physicists to take into account the acts of “observation” and “measurement” to comprehend the paradox in question, and thereby come out resolving it. But there is still a minority of quantum physicists out there who vouch for the supremacy of an observer, despite the quantum entanglement effect that goes on to explain the impacts of “observation” and “measurement”.(2) Such a standpoint is indeed reflected in Derrida (9-10) as well, when he says (I quote him in full),

The modern dominance of the principle of reason had to go hand in hand with the interpretation of the essence of beings as objects, and object present as representation (Vorstellung), an object placed and positioned before a subject. This latter, a man who says ‘I’, an ego certain of itself, thus ensures his own technical mastery over the totality of what is. The ‘re-‘ of repraesentation also expresses the movement that accounts for – ‘renders reason to’ – a thing whose presence is encountered by rendering it present, by bringing it to the subject of representation, to the knowing self.

If Derridean deconstruction needs to work on science and theory, the only way out is to relinquish the boundaries that define or divide the two disciplines. Moreover, if there is any looseness encountered in objectivity, the ramifications are felt directly at the level of scientific activity. Even theory does not remain immune to these consequences. Importantly, as scientific objectivity starts to wane, a corresponding philosophical luxury of avoiding the contingent wanes with it. Such a loss of representation, congruent with a certain theory of meaning we live by, has serious ethical-political implications.

(1) Heisenbug is a pun on Heisenberg’s uncertainty principle and is a bug in computing that is characterized by a disappearance of the bug itself when an attempt is made to study it. One common example is a bug that occurs in a program that was compiled with an optimizing compiler, but not in the same program when compiled without optimization (e.g., for generating a debug-mode version). Another example is a bug caused by a race condition. A heisenbug may also appear in a system that does not conform to the command-query separation design guideline, since a routine called more than once could return different values each time, generating hard-to-reproduce bugs in a race condition scenario. One common reason for heisenbug-like behaviour is that executing a program in debug mode often cleans memory before the program starts, and forces variables onto stack locations, instead of keeping them in registers. These differences in execution can alter the effect of bugs involving out-of-bounds member access, incorrect assumptions about the initial contents of memory, or floating-point comparisons (for instance, when a floating-point variable in a 32-bit stack location is compared to one in an 80-bit register). Another reason is that debuggers commonly provide watches or other user interfaces that cause additional code (such as property accessors) to be executed, which can, in turn, change the state of the program. Yet another reason is a fandango on core, the effect of a pointer running out of bounds. In C++, many heisenbugs are caused by uninitialized variables. Another similar pun-intended bug encountered in computing is the Schrödinbug. A schrödinbug is a bug that manifests only after someone reading source code or using the program in an unusual way notices that it never should have worked in the first place, at which point the program promptly stops working for everybody until fixed. The Jargon File adds: “Though… this sounds impossible, it happens; some programs have harbored latent schrödinbugs for years.”
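A toy sketch of the race-condition variety in Python (illustrative only; whether the lost updates actually appear depends on timing and interpreter version, which is precisely what makes it a heisenbug):

```python
import threading

COUNTER = 0
N = 200_000

def worker(debug: bool = False) -> None:
    global COUNTER
    for i in range(N):
        # This read-modify-write is not atomic: two threads can read the same value
        # and one increment is lost. Adding instrumentation (the debug print below)
        # changes the interleaving, and the bug may quietly disappear.
        COUNTER += 1
        if debug and i % 50_000 == 0:
            print(threading.current_thread().name, COUNTER)

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"expected {2 * N}, got {COUNTER}")  # often comes up short when run without 'debug'
```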

(2) There is a related issue in quantum mechanics relating to whether systems have pre-existing – prior to measurement, that is – properties corresponding to all measurements that could possibly be made on them. The assumption that they do is often referred to as “realism” in the literature, although it has been argued that the word “realism” is being used in a more restricted sense than philosophical realism. A recent experiment in the realm of quantum physics has been quoted as meaning that we have to “say goodbye” to realism, although the author of the paper states only that “we would [..] have to give up certain intuitive features of realism”. These experiments demonstrate a puzzling relationship between the act of measurement and the system being measured, although it is clear from experiment that an “observer” consisting of a single electron is sufficient – the observer need not be a conscious observer. Also, note that Bell’s Theorem suggests strongly that the idea that the state of a system exists independently of its observer may be false. Note that the special role given to observation (the claim that it affects the system being observed, regardless of the specific method used for observation) is a defining feature of the Copenhagen Interpretation of quantum mechanics. Other interpretations resolve the apparent paradoxes from experimental results in other ways. For instance, the Many-Worlds Interpretation posits the existence of multiple universes in which an observed system displays all possible states to all possible observers. In this model, observation of a system does not change the behavior of the system – it simply answers the question of which universe(s) the observer(s) is(are) located in: In some universes the observer would observe one result from one state of the system, and in others the observer would observe a different result from a different state of the system.

Quantum Entanglement, Post-Selection and Time Travel

If the Copenhagen interpretation of quantum mechanics is to be believed, nothing exists in reality until a measurement is carried out. In the delayed-choice double-slit experiment proposed by John Wheeler, post-selection can be made to work after the experiment is finished, by delaying the observation until after the photon has purportedly passed through the slits. Now, if post-selection is to work, there must be a change in the properties in the past. This has been experimentally demonstrated by physicists like Jean-François Roch at the École Normale Supérieure in Cachan, France. This is weird, but what surprises is the invocation of quantum entanglement and the way it is thrown up for grabs against the philosophic principle of causality. If the experimental set-up impacts the future course of the outcome, quantum particles, in a most whimsical manner, are liable to negate it. This happens due to the mathematics governing these particles, which enables, or rather disables, them to differentiate between the courses they are supposed to undertake. In short, what happens in the future could determine the past….

….If particles are caught up in quantum entanglement, the measurement of one immediately affects the other, some kind of Einsteinian spooky action at a distance.

A weird connection was what sprang up in my mind this morning, and the vestibule comes from French theoretical considerations. Without any kind of specificity, the knower and the known are crafted together by a mediation that rides on instability, populated by discursive and linguistic norms and forms, and that is derided as secondary in the analytical tradition. The autonomy of the knower as against the known is questionable, and derives significance only when its trajectory is mapped by a simultaneity put forth by the known.
Does this not imply that French theory is getting close to interpreting quantum mechanics? Just shockingly weird….
Anyways, adieu to this and still firmed up in this post from last week.