Metaphysics of the Semantics of HoTT. Thought of the Day 73.0


Types and tokens are interpreted as concepts (rather than spaces, as in the homotopy interpretation). In particular, a type is interpreted as a general mathematical concept, while a token of a given type is interpreted as a more specific mathematical concept qua instance of the general concept. This accords with the fact that each token belongs to exactly one type. Since ‘concept’ is a pre-mathematical notion, this interpretation is admissible as part of an autonomous foundation for mathematics.

Expressions in the language are the names of types and tokens. Those naming types correspond to propositions. A proposition is ‘true’ just if the corresponding type is inhabited (i.e. there is a token of that type, which we call a ‘certificate’ to the proposition). There is no way in the language of HoTT to express the absence or non-existence of a token. The negation of a proposition P is represented by the type P → 0, where P is the type corresponding to proposition P and 0 is a type that by definition has no token constructors (corresponding to a contradiction). The logic of HoTT is not bivalent, since the inability to construct a token of P does not guarantee that a token of P → 0 can be constructed, and vice versa.
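This can be made concrete in a proof assistant. Below is a minimal sketch in Lean 4, whose underlying type theory is a close relative of HoTT's; the names `Zero` and `Neg` are illustrative choices, not standard-library names.

```lean
-- An empty type: no token constructors, corresponding to contradiction.
inductive Zero : Type

-- The negation of a proposition P is the function type P → Zero:
-- a certificate for ¬P turns any (impossible) token of P into a token of Zero.
def Neg (P : Type) : Type := P → Zero

-- A certificate that Zero is uninhabited: the identity map inhabits Zero → Zero.
example : Neg Zero := fun z => z

-- A certificate to a conjunction is a pair of certificates.
example (P Q : Type) (p : P) (q : Q) : P × Q := (p, q)
```

Note that the inability to exhibit a token of some arbitrary `P` is not itself expressible: only the construction of a token of `P` or of `Neg P` can be recorded, which is the failure of bivalence described above.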

The rules governing the formation of types are understood as ways of composing concepts to form more complex concepts, or as ways of combining propositions to form more complex propositions. They follow from the Curry-Howard correspondence between logical operations and operations on types. However, we depart slightly from the standard presentation of the Curry-Howard correspondence, in that the tokens of types are not to be thought of as ‘proofs’ of the corresponding propositions but rather as certificates to their truth. A proof of a proposition is the construction of a certificate to that proposition by a sequence of applications of the token construction rules. Two different such processes can result in construction of the same token, and so proofs and tokens are not in one-to-one correspondence.

When we work formally in HoTT we construct expressions in the language according to the formal rules. These expressions are taken to be the names of tokens and types of the theory. The rules are chosen such that if a construction process begins with non-contradictory expressions that all name tokens (i.e. none of the expressions are ‘empty names’) then the result will also name a token (i.e. the rules preserve non-emptiness of names).

Since we interpret tokens and types as concepts, the only metaphysical commitment required is to the existence of concepts. That human thought involves concepts is an uncontroversial position, and our interpretation does not require that concepts have any greater metaphysical status than is commonly attributed to them. Just as the existence of a concept such as ‘unicorn’ does not require the existence of actual unicorns, likewise our interpretation of tokens and types as mathematical concepts does not require the existence of mathematical objects. However, it is compatible with such beliefs. Thus a Platonist can take the concept, say, ‘equilateral triangle’ to be the concept corresponding to the abstract equilateral triangle (after filling in some account of how we come to know about these abstract objects in a way that lets us form the corresponding concepts). Even without invoking mathematical objects to be the ‘targets’ of mathematical concepts, one could still maintain that concepts have a mind-independent status, i.e. that the concept ‘triangle’ continues to exist even while no-one is thinking about triangles, and that the concept ‘elliptic curve’ did not come into existence at the moment someone first gave the definition. However, this is not a necessary part of the interpretation, and we could instead take concepts to be mind-dependent, with corresponding implications for the status of mathematics itself.


Conjuncted: Avarice

Greed followed by avarice…. We consider the variation in which exchange events occur at a rate equal to the difference in capital of the two traders. That is, an individual is more likely to take capital from a much poorer person than from someone of only slightly less wealth. For this “avaricious” exchange, the corresponding rate equations are

dck/dt = ck-1∑j=1k-1(k – 1 – j)cj + ck+1∑j=k+1∞(j – k – 1)cj – ck∑j=1∞|k – j|cj —– (1)

while the total density obeys,

dN/dt = -c1(1 – N) —– (2)

under the assumption that the total wealth density is set equal to one, ∑k kck = 1.

These equations can be solved by again applying scaling. For this purpose, it is first expedient to rewrite the rate equation as,

dck/dt = (ck-1 – ck)∑j=1k-1(k – j)cj – ck-1∑j=1k-1cj + (ck+1 – ck)∑j=k+1∞(j – k)cj – ck+1∑j=k+1∞cj —– (3)

Taking the continuum limit gives

∂c/∂t = ∂c/∂k – N∂/∂k(kc) —– (3′)

We now substitute the scaling ansatz,

ck(t) ≅ N²C(x), with x = kN, to yield

C(0)[2C + xC′] = (x − 1)C′ + C —– (4)

and

dN/dt = -C(0)N² —– (5)

Solving the above equations gives N ≅ [C(0)t]^(−1) and

C(x) = (1 + μ)(1 + μx)^(−2−1/μ) —– (6)

with μ = C(0) − 1. The scaling approach has thus found a family of solutions which are parameterized by μ, and additional information is needed to determine which of these solutions is appropriate for our system. For this purpose, note that equation (6) exhibits different behaviors depending on the sign of μ. When μ > 0, there is an extended non-universal power-law distribution, while for μ = 0 the solution is the pure exponential, C(x) = e−x. These solutions may be rejected because the wealth distribution cannot extend over an unbounded domain if the initial wealth extends over a finite range.
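As a sanity check on the algebra, one can verify numerically that the family (6), with μ = C(0) − 1, satisfies equation (4). The sketch below is only illustrative; the sampled values of μ and x are arbitrary test points.

```python
# Check that C(x) = (1 + mu)(1 + mu*x)**(-2 - 1/mu), with mu = C(0) - 1,
# satisfies equation (4): C(0)[2C + xC'] = (x - 1)C' + C.

def C(x, mu):
    return (1.0 + mu) * (1.0 + mu * x) ** (-2.0 - 1.0 / mu)

def residual(x, mu, h=1e-6):
    c0 = 1.0 + mu                                   # C(0)
    dC = (C(x + h, mu) - C(x - h, mu)) / (2.0 * h)  # central-difference C'(x)
    return c0 * (2.0 * C(x, mu) + x * dC) - ((x - 1.0) * dC + C(x, mu))

# residuals should vanish (up to finite-difference error) for any mu
max_res = max(abs(residual(x, mu))
              for mu in (-0.5, 0.3, 1.0)
              for x in (0.1, 0.5, 1.0))
```

The residuals vanish for every μ sampled, confirming that (4) alone does not single out a member of the family, which is why the front-matching argument below is needed.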

The accessible solutions therefore correspond to −1 < μ < 0, for which the distribution is compact, with C(x) ≡ 0 for x ≥ xf = −1/μ. To determine the true solution, let us re-examine the continuum form of the rate equation, equation (3). From naive power counting, the first two terms are asymptotically dominant, and they give a propagating front with kf exactly equal to t. Consequently, the scaled location of the front is given by xf = Nkf. Now the result N ≅ [C(0)t]^(−1) gives xf = 1/C(0). Comparing this expression with the corresponding value from the scaling approach, xf = [1 − C(0)]^(−1), selects the value C(0) = 1/2. Remarkably, this scaling solution coincides with the Fermi distribution found for the case of a constant interaction rate. Finally, in terms of the unscaled variables k and t, the wealth distribution is

ck(t) = 2/t², k < t

= 0, k ≥ t —– (7)

This discontinuity is smoothed out by diffusive spreading. Another interesting feature is that if the interaction rate is sufficiently greedy, “gelation” occurs, whereby a finite fraction of the total capital is possessed by a single individual. For interaction kernels K(j, k) between individuals of capital j and k which do not give rise to gelation, the total density typically varies as a power law in time, while for gelling kernels N(t) goes to zero at some finite time. At the border between these regimes N(t) typically decays exponentially in time. We seek a similar transition in behavior for the capital exchange model by considering the rate equation for the density,

dN/dt = -c1∑k=1∞K(1, k)ck —– (8)

For the family of kernels with K(1, k) ∼ k^ν as k → ∞, substitution of the scaling ansatz gives Ṅ ∼ −N^(3−ν). Thus N(t) exhibits a power-law behavior, N ∼ t^(−1/(2−ν)), for ν < 2 and an exponential behavior for ν = 2. Thus gelation should arise for ν > 2.
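This threshold can be checked directly on the effective equation Ṅ ∼ −N^(3−ν). A minimal sketch (the value ν = 1 and the step size are arbitrary illustrative choices) integrates it and compares with the exact power-law decay:

```python
# Integrate dN/dt = -N^(3 - nu) for a non-gelling kernel exponent nu < 2
# and compare with the exact solution N(t) = [1 + (2 - nu) t]^(-1/(2 - nu)),
# which decays as t^(-1/(2 - nu)) at long times.

NU = 1.0                 # arbitrary test exponent, nu < 2
DT = 1e-3
STEPS = 100_000          # integrate up to t = 100

n = 1.0                  # N(0) = 1
for _ in range(STEPS):
    n -= DT * n ** (3.0 - NU)

t = STEPS * DT
predicted = (1.0 + (2.0 - NU) * t) ** (-1.0 / (2.0 - NU))
```

For ν = 1 this gives N(t) = 1/(1 + t), a power law; repeating with ν → 2 the exponent −1/(2 − ν) diverges, signalling the crossover to exponential decay.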

Greed

In greedy exchange, when two individuals meet, the richer person takes one unit of capital from the poorer person, as represented by the reaction scheme (j, k) → (j + 1, k − 1) for j ≥ k. In the rate equation approximation, the densities ck(t) now evolve according to

dck/dt = ck-1∑j=1k-1cj + ck+1∑j=k+1∞cj – ckN – ck² —– (1)

The first two terms account for the gain in ck(t) due to interactions between pairs of individuals of capitals (j, k−1), with j ≤ k−1, and (j, k+1), with j ≥ k+1, respectively. The last two terms correspondingly account for the loss of ck(t). One can check that the wealth density M1 ≡ ∑k=1∞ kck(t) is conserved, and that the population density obeys

dN/dt = -c1N —– (2)

Equation (1) is conceptually similar to the Smoluchowski equations for aggregation with a constant reaction rate. Mathematically, however, it appears to be more complex, and we have been unable to solve it analytically. Fortunately, equation (1) is amenable to a scaling solution. For this purpose, we first re-write equation (1) as

dck/dt = -ck(ck + ck+1) + N(ck-1 – ck) + (ck+1 – ck-1)∑j=k∞cj —– (3)

Taking the continuum limit and substituting the scaling ansatz,

ck(t) ≅ N²C(x), with x = kN —– (4)

transforms equations (2) and (3) to

dN/dt = -C(0)N³ —– (5)

and

C(0)[2C + xC′] = 2C² + C′[1 – 2∫x∞ dy C(y)] —– (6)

where C ′ = dC/dx. Note also that the scaling function must obey the integral relations

∫0∞ dx C(x) = 1 and ∫0∞ dx xC(x) = 1 —– (7)

The former follows from the definition of the density, N = ∑k ck(t) ≅ N∫0∞ dx C(x), while the latter follows if we set, without loss of generality, the conserved wealth density equal to unity, ∑k kck(t) = 1.

Introducing B(x) = ∫0x dy C(y) recasts equation (6) into C(0)[2B′ + xB″] = 2B′² + B″[2B − 1]. Integrating twice gives [C(0)x − B][B − 1] = 0, with solution B(x) = C(0)x for x < xf and B(x) = 1 for x ≥ xf, from which we conclude that the scaled wealth distribution C(x) = B′(x) coincides with the zero-temperature Fermi distribution:

C(x) = C(0), for x < xf

= 0, for x ≥ xf —– (8)

Hence the scaled profile has a sharp front at x = xf, with xf = 1/C(0) found by matching the two branches of the solution for B(x). Making use of the second integral relation, equation (7), gives C(0) = 1/2 and thereby closes the solution. Thus, the unscaled wealth distribution ck(t) reads,

ck(t) = 1/(2t), for k < 2√t

= 0, for k ≥ 2√t —– (9)

and the total density is N(t) = t^(−1/2).


Figure: Simulation results for the wealth distribution in greedy additive exchange, based on 2500 configurations for 10⁶ traders. Shown are the scaled distributions C(x) versus x = kN for t = 1.5^n, with n = 18, 24, 30, and 36; these steepen with increasing time. Each data set has been averaged over a range of ≈ 3% of the data points to reduce fluctuations.

These predictions are tested by numerical simulations, with results shown in the figure. In the simulation, two individuals are randomly chosen to undergo greedy exchange, and this process is repeated. When an individual reaches zero capital he is eliminated from the system, and the number of active traders is reduced by one. After each reaction, the time is incremented by the inverse of the number of active traders. While the mean-field predictions are substantially corroborated, the scaled wealth distribution at finite time actually resembles a finite-temperature Fermi distribution. As time increases, the wealth distribution becomes sharper and approaches equation (9). In analogy with the Fermi distribution, the relative width of the front may be viewed as an effective temperature. Thus the wealth distribution is characterized by two scales: one, of order √t, characterizes the typical wealth of active traders; a second, smaller scale characterizes the width of the front.
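As a complement to direct Monte Carlo, the mean-field equations (1)–(2) can be integrated numerically. The sketch below uses a simple Euler scheme; `KMAX`, `DT` and `T_FINAL` are illustrative numerical choices, not values from the text.

```python
# Euler integration of the greedy-exchange rate equations, starting from
# c_k(0) = delta_{k,1}.

KMAX = 100          # truncation of the capital axis
DT = 0.01
T_FINAL = 64.0      # front then sits at k_f = 2*sqrt(t) = 16, well below KMAX

c = [0.0] * (KMAX + 2)
c[1] = 1.0          # everyone starts with one unit of capital

for _ in range(int(T_FINAL / DT)):
    # prefix sums S[k] = sum_{j <= k} c_j; total density N = S[KMAX]
    S = [0.0] * (KMAX + 2)
    for k in range(1, KMAX + 1):
        S[k] = S[k - 1] + c[k]
    N = S[KMAX]
    new = c[:]
    for k in range(1, KMAX + 1):
        below = S[k - 1]      # partners poorer than or equal to k-1
        above = N - S[k]      # partners richer than or equal to k+1
        rhs = c[k - 1] * below + c[k + 1] * above - c[k] * N - c[k] ** 2
        new[k] = c[k] + DT * rhs
    c = new

N = sum(c)                                   # approaches t^(-1/2)
M1 = sum(k * ck for k, ck in enumerate(c))   # conserved wealth density
```

At t = 64 the computed density approaches t^(−1/2), the wealth density stays at unity, and the distribution vanishes well beyond the front at kf = 2√t = 16, in line with equation (9).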

To quantify the spreading of the front, let us include the next corrections in the continuum limit of the rate equations, equation (3). This gives,

∂c/∂t = 2∂/∂k [c∫k∞ dj c(j)] – c∂c/∂k – N∂c/∂k + (N/2)∂²c/∂k² —– (10)

Here, the second and fourth terms on the RHS are the next corrections. Since the convective third term determines the location of the front to be at kf = 2√t, it is natural to expect that the diffusive fourth term describes the spreading of the front. The term c∂c/∂k turns out to be negligible in comparison to the diffusive spreading term and is henceforth neglected. The dominant convective term can be removed by transforming to a frame of reference which moves with the front, namely k → K = k − 2√t. Among the remaining terms in the transformed rate equation, the width of the front region W can now be determined by demanding that the diffusion term has the same order of magnitude as the reactive terms, i.e. N∂²c/∂K² ∼ c². This implies W ∼ √(N/c). Combining this with N = t^(−1/2) and c ∼ t^(−1) gives W ∼ t^(1/4), or a relative width w = W/kf ∼ t^(−1/4). This suggests the appropriate scaling ansatz for the front region is

ck(t) = (1/t)X(ξ), ξ = (k – 2√t)/t^(1/4) —– (11)

Substituting this ansatz into equation (10) gives a non-linear single-variable integro-differential equation for the scaling function X(ξ). Together with the appropriate boundary conditions, this represents, in principle, a more complete solution to the wealth distribution. However, the essential scaling behavior of the finite-time spreading of the front is already contained in equation (11), so that solving for X(ξ) itself does not provide additional scaling information. Numerical analysis of the front width gives w ∼ t^(−α) with α ≅ 1/5, slightly below the predicted exponent of 1/4. We attribute this discrepancy to the fact that w is obtained by differentiating C(x), an operation which generally leads to an increase in numerical errors.

Fortune of the Individuals Restricted to Integers: Random Economic Exchange Between Populations of Traders.


Consider a population of traders, each of whom possesses a certain amount of capital, which is assumed to be quantized in units of minimal capital. Taking this latter quantity as the basic unit, the fortune of an individual is restricted to the integers. The wealth of the population evolves by the repeated interaction of random pairs of traders. In each interaction, one unit of capital is transferred between the trading partners. To complete the description, we specify that when a poorest individual (with one unit of capital) loses this remaining unit, the bankrupt individual is considered to be economically dead and no longer participates in economic activity.

In the following, we consider a specific realization of additive capital exchange, the “random” exchange, where the direction of the capital exchange is independent of the relative capital of the traders. While this rule has little economic basis, the model is completely soluble and thus provides a useful pedagogical example.

In a random exchange, one unit of capital is exchanged between trading partners, as represented by the reaction scheme (j, k) → (j ± 1, k ∓ 1). Let ck(t) be the density of individuals with capital k. Within a mean-field description, ck(t) evolves according to

dck(t)/dt = N(t) [ck+1(t) + ck-1(t) – 2ck(t)] —– (1)

with N(t) ≡ M0(t) = ∑k=1∞ ck(t) the population density. The first two terms account for the gain in ck(t) due to the interactions (j, k + 1) → (j + 1, k) and (j, k − 1) → (j − 1, k), respectively, while the last term accounts for the loss in ck(t) due to the interactions (j, k) → (j ± 1, k ∓ 1).

By defining a modified time variable,

T = ∫0t dt′ N(t′) —– (2)

equation (1) is reduced to the discrete diffusion equation

dck(T)/dT = ck+1(T) + ck-1(T) – 2ck(T) —– (3)

The rate equation for the poorest density has the slightly different form, dc1/dT = c2 − 2c1, but may be written in the same form as equation (3) if we impose the boundary condition c0(T) = 0.
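Equation (3) with this boundary condition is straightforward to integrate numerically. The sketch below is illustrative (`KMAX` and `DT` are arbitrary numerical choices); it reproduces the conservation of wealth and the decay of the population density.

```python
# Euler integration of the discrete diffusion equation (3) with the
# absorbing boundary c_0(T) = 0 and initial condition c_k(0) = delta_{k,1}.

KMAX = 100
DT = 0.001
T_FINAL = 10.0

c = [0.0] * (KMAX + 1)
c[1] = 1.0                      # everyone starts with one unit of capital

for _ in range(int(T_FINAL / DT)):
    new = c[:]
    for k in range(1, KMAX):
        left = c[k - 1]         # c[0] stays 0: bankrupt traders are removed
        right = c[k + 1]
        new[k] = c[k] + DT * (right + left - 2.0 * c[k])
    c = new

N = sum(c)                                   # population density at T = 10
M1 = sum(k * ck for k, ck in enumerate(c))   # wealth density (conserved)
```

At T = 10 the computed N agrees to within about a percent with the exact Bessel-function expression given below, while the wealth density stays pinned at unity.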

For illustrative purposes, let us assume that initially all individuals have one unit of capital, ck(0) = δk1. The solution to equation (3) subject to these initial and boundary conditions is

ck(T) = e−2T [Ik−1(2T) − Ik+1(2T)] —– (4)

where In denotes the modified Bessel function of order n. Consequently, the total density N(T) is

N(T) = e−2T [I0(2T) + I1(2T)] —– (5)

To re-express this exact solution in terms of the physical time t, we first invert equation (2) to obtain t(T) = ∫0T dT′/N(T′), and then eliminate T in favor of t in the solution for ck(T). For simplicity and concreteness, let us consider the long-time limit. From equation (4),

ck(T) ≅ k/√(4πT³) exp(−k²/4T) —– (6)

and from equation (5),

N(T) ≅ (πT)^(−1/2) —– (7)

Equation (7) also implies t ≅ (2/3)√(πT³), which gives

N(t) ≅ [2/(3πt)]^(1/3) —– (8)

and

ck(t) ≅ (k/3t) exp[−(π/144)^(1/3) k²/t^(2/3)] —– (9)

Note that this latter expression may be written in the scaling form ck(t) ∝ N²xe^(−x²), with the scaling variable x ∝ kN. One can also confirm that the scaling solution represents the basin of attraction for almost all exact solutions. Indeed, for any initial condition with ck(0) decaying faster than k^(−2), the system reaches the scaling limit ck(t) ∝ N²xe^(−x²). On the other hand, if ck(0) ∼ k^(−1−α), with 0 < α < 1, such an initial state converges to an alternative scaling limit which depends on α. These solutions exhibit a slower decay of the total density, N ∼ t^(−α/(1+α)), while the scaling form of the wealth distribution is

ck(t) ∼ N^(2/α)Cα(x), x ∝ kN^(1/α) —– (10)

with the scaling function

Cα(x) = e^(−x²)∫0∞ du e^(−u²) sinh(2ux)/u^(1+α) —– (11)

Evaluating the integral by the Laplace method gives an asymptotic distribution which exhibits the same x^(−1−α) tail as the initial distribution. This anomalous scaling in the solution to the diffusion equation is a direct consequence of the extended initial condition. This latter case is not physically relevant, however, since the extended initial distribution leads to a divergent initial wealth density.

Arbitrage, or Tensors thereof…


What is an arbitrage? Basically it means “to get something from nothing”, a free lunch after all. A stricter definition states that an arbitrage is an operational opportunity to make a risk-free profit with a rate of return higher than the risk-free interest rate accrued on deposit.

Arbitrage appears in the theory when we consider the curvature of the connection. The rate of excess return for an elementary arbitrage operation (the difference between the rate of return for the operation and the risk-free interest rate) is an element of the curvature tensor calculated from the connection. This can be understood by keeping in mind that curvature tensor elements are related to the difference between the results of two infinitesimal parallel transports performed in different orders. In financial terms, this means that the curvature tensor elements measure the difference in gains accrued from two financial operations with the same initial and final points or, in other words, the gain from an arbitrage operation.
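A toy numerical illustration may help fix the idea (all exchange rates below are invented for the example): transporting one unit of currency around a closed loop of exchanges is a discrete parallel transport, and the loop's deviation from unity plays the role of a curvature element.

```python
# Toy illustration of "curvature = arbitrage": transport one unit of
# currency around a closed loop of exchanges. All rates are invented
# for the example; a real market would quote them live.

usd_to_eur = 0.90    # 1 USD buys 0.90 EUR
eur_to_gbp = 0.88    # 1 EUR buys 0.88 GBP
gbp_to_usd = 1.30    # 1 GBP buys 1.30 USD

# "Parallel transport" of 1 USD around the loop USD -> EUR -> GBP -> USD.
loop_factor = usd_to_eur * eur_to_gbp * gbp_to_usd

# A flat (zero-curvature) connection gives loop_factor == 1: no arbitrage.
# The deviation from 1 is the gain of the elementary arbitrage operation.
excess_return = loop_factor - 1.0
```

With these invented rates the loop factor is about 1.03, i.e. a 3% risk-free gain per circuit; a consistent, arbitrage-free set of rates would force every such closed loop back to exactly 1.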

In a certain sense, the rate of excess return for an elementary arbitrage operation is an analogue of the electromagnetic field. In the absence of any uncertainty (or, in other words, in the absence of random walks of prices, exchange rates and interest rates), the only state realised is the state of zero arbitrage. However, once we place uncertainty in the game, prices and rates move, and virtual arbitrage opportunities to get more for less appear. Therefore we can say that uncertainty plays the same role in the developing theory as quantization did for quantum gauge theory.

What, then, of the “matter” fields, which interact through the connection? The “matter” fields are money flow fields, which have to be gauged by the connection. Dilatations of money units (which do not change real wealth) play the role of gauge transformations; the effect of a dilatation is eliminated by a proper retuning of the connection (interest rates, exchange rates, prices and so on), exactly as the Fisher formula does for the real interest rate in the case of inflation. The symmetry of real wealth under local dilatations of money units (security splits and the like) is the gauge symmetry of the theory.

A theory may contain several types of “matter” fields, which may differ, for example, by the sign of the connection term, as for positive and negative charges in electrodynamics. On the financial stage this means different preferences of investors. An investor’s strategy is not always optimal, partly because of incomplete information, partly because of the choice procedure, and partly because of investors’ (or managers’) internal objectives. Physics of Finance

 

 

The Left Needs the Stupid to Survive…


Social pathologies, or the social pathologist undoubtedly. Orwell developed his Newspeak dictionary in order to explain the cognitive phenomenon he observed around him with regard to those committed to the left. That is not to say that the phenomenon cannot occur on the right, since many mass-movement ideologies are logically contradictory, and to sustain themselves their adherents must engage in mental gyrations to maintain their belief. Orwell conceived Newspeak as part of the apparatus of totalitarian control, something forced onto an unwitting and unwilling public. It never occurred to Orwell that the masses would not care as long as their animal desires were being provided for. The Party, much like Juvenal before them, recognized that the public would not much care about higher concepts such as truth or freedom as long as their bread and circuses, in the form of the cynically-named Prolefeed, were supplied. In fact, trying to pry them away from such materialities towards ‘truth’ would more likely cause them to support the existing regime. This means that a capitalist totalitarianism, with its superior ability to provide material goods, would be harder to dislodge than a socialist one.

Take for example the notion of Doublethink, the idea of keeping two mutually opposing ideas in one’s head without noticing the contradiction. Orwell saw this mode as an aberration with regard to normal thought, but never realized that for the common man it is an ordinary mode of cognition. Or take the concept of Bellyfeel, of which Orwell states,

Consider, for example, a typical sentence from a Times leading article such as “Oldthinkers unbellyfeel Ingsoc”. The shortest rendering one could make of this in Oldspeak would be: “Those whose ideas were formed before the revolution cannot have a full understanding of the principles of English socialism.” But this is not an adequate translation…only a person thoroughly grounded in Ingsoc could appreciate the full force of the word bellyfeel, which implied a blind, enthusiastic and casual acceptance difficult to imagine today.

“Gut-instinct”, more than reason, is mass man’s mechanism of political orientation. This is why Fascism and Socialism are better understood as appeals to the gut-brain rather than as logically and empirically justified modes of political thought. Totalitarian regimes cannot rely solely on oppression for their survival; they also need a degree of cooperation amongst the population, and they bring this about by exploiting the cognitive miserliness of the average man. Orwell, just like many other left-wing intellectuals, never really appreciated the mindset of the proletariat he stood just outside of. His fundamental misunderstanding of Newspeak lay in the rationalist fallacy, which assumes that the average man is rational when it counts; the problem is that for the average man cognitive miserliness is the norm. A lot of mainstream conservative thought is based on the same rationalist premise, which in turn undermines its own survival and helps feed the leftist beast. Any conservative who believes in the right of the cognitive miser to choose is a dead man walking. This criticism of the prole-mind is not based on any snobbery; rather, it has a functional basis. Competency, not class, should be the eligibility for decision-making, and thus it is no wonder the left needs the stupid to survive.

Malignant Acceleration in Tech-Finance. Some Further Rumination on Regulations. Thought of the Day 72.1


Regardless of the positive effects that HFT offers, such as reduced spreads, higher liquidity, and faster price discovery, it is mostly its negative side that has caught people’s attention. Several notorious market failures and accidents in recent years all seem to be related to HFT practices. They showed how much risk HFT can involve and how huge the damage can be.

HFT heavily depends on the reliability of the trading algorithms that generate, route, and execute orders. High-frequency traders thus must ensure that these algorithms have been tested completely and thoroughly before they are deployed into the live systems of the financial markets. Any improperly tested or prematurely released algorithm may cause losses to both investors and the exchanges. Several examples demonstrate the extent of the ever-present vulnerabilities.

In August 2012, the Knight Capital Group implemented a new liquidity-testing software routine into its trading system, which was running live on the NYSE. The system started making bizarre trading decisions, quadrupling the price of one company, Wizzard Software, as well as bidding up the price of much larger entities, such as General Electric. Within 45 minutes, the company lost USD 440 million. After this event and the weakening of Knight Capital’s capital base, it agreed to merge with another algorithmic trading firm, Getco, which is the biggest HFT firm in the U.S. today. This example emphasizes the importance of firms implementing precautions to ensure that their algorithms are not mistakenly deployed.

Another example is Everbright Securities in China. In 2013, the state-owned brokerage firm Everbright Securities Co. sent more than 26,000 mistaken buy orders, totalling RMB 23.4 billion (USD 3.82 billion), to the Shanghai Stock Exchange (SSE), pushing its benchmark index up 6% in two minutes. This resulted in a trading loss of approximately RMB 194 million (USD 31.7 million). In a follow-up evaluative study, the China Securities Regulatory Commission (CSRC) found that there were significant flaws in Everbright’s information and risk management systems.

The damage caused by HFT errors is not limited to specific trading firms themselves, but may also involve stock exchanges and the stability of the related financial market. On Friday, May 18, 2012, the stock of the social network giant Facebook was issued on the NASDAQ exchange, the most anticipated initial public offering (IPO) in the exchange’s history. However, technology problems with the opening made a mess of the IPO. The offering attracted HFT traders and very large order flows were expected; before the IPO, NASDAQ was confident in its ability to deal with the high volume of orders.

But when the deluge of orders to buy, sell and cancel trades came, NASDAQ’s trading software began to fail under the strain. This resulted in a 30-minute delay on NASDAQ’s side, and a 17-second blackout for all stock trading at the exchange, causing further panic. Scrutiny of the problems immediately led to fines for the exchange and accusations that HFT traders bore some responsibility too. Problems persisted after opening, with many customer orders from institutional and retail buyers unfilled for hours or never filled at all, while others ended up buying more shares than they had intended. This incredible gaffe, estimated to have cost traders USD 100 million, eclipsed NASDAQ’s achievement in winning Facebook’s IPO, the third largest in U.S. history.

Another instance occurred on May 6, 2010, when U.S. financial markets were surprised by what has been referred to ever since as the “Flash Crash”. Within less than 30 minutes, the main U.S. stock markets experienced their single largest intraday price declines, with a drop of more than 5% for many U.S.-based equity products. In addition, the Dow Jones Industrial Average (DJIA), at its lowest point that day, fell by nearly 1,000 points, although this was followed by a rapid rebound. This brief period of extreme intraday volatility demonstrated the weakness of the structure and stability of U.S. financial markets, as well as the opportunities for volatility-focused HFT traders. Although a subsequent investigation by the SEC cleared high-frequency traders of directly having caused the Flash Crash, they were still blamed for exaggerating market volatility and withdrawing liquidity for many U.S.-based equities (Flash Boys).

Since the mid-2000s, the average trade size in the U.S. stock market had plummeted, the markets had fragmented, and the gap in time between the public view of the markets and the view of high-frequency traders had widened. The rise of high-frequency trading had also been accompanied by a rise in stock market volatility – over and above the turmoil caused by the 2008 financial crisis. The price volatility within each trading day in the U.S. stock market between 2010 and 2013 was nearly 40 percent higher than the volatility between 2004 and 2006, for instance. There were days in 2011 in which volatility was higher than in the most volatile days of the dot-com bubble. Although these different incidents have different causes, the effects were similar and some common conclusions can be drawn. The presence of algorithmic trading and HFT in the financial markets exacerbates the adverse impacts of trading-related mistakes. It may lead to extremely high market volatility and surprises about suddenly-diminished liquidity. This raises concerns about the stability and health of the financial markets for regulators. With the continuous and fast development of HFT, larger and larger shares of equity trades in the U.S. financial markets came from HFT. Also, there was mounting evidence that HFT-related errors disturbed market stability and caused significant financial losses. This led the regulators to increase their attention and effort to provide the exchanges and traders with guidance on HFT practices. They also expressed concerns about high-frequency traders extracting profit at the cost of traditional investors and even manipulating the market. For instance, high-frequency traders can generate a large number of orders within microseconds to exacerbate a trend. Other types of misconduct include: ping orders, which use some orders to detect other hidden orders; and quote stuffing, which issues a large number of orders to create uncertainty in the market.
HFT creates room for these kinds of market abuses, and its blazing speed and huge trade volumes make their detection difficult for regulators. Regulators have taken steps to increase their regulatory authority over HFT activities. Some of the problems that arose in the mid-2000s led to regulatory hearings in the United States Senate on dark pools, flash orders and HFT practices. Another example occurred after the Facebook IPO problem, which led the SEC to call for a limit up-limit down mechanism at the exchanges to prevent trades in individual securities from occurring outside of a specified price range, so that market volatility would be under better control. These regulatory actions put stricter requirements on HFT practices, aiming to minimize the market disturbance when many fast trading orders occur within a day.
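The logic of such a mechanism is simple to sketch (the 5% band and the prices below are illustrative only; the actual SEC bands depend on the security's tier and reference price):

```python
# Minimal sketch of a limit up-limit down style band check.

def within_band(trade_price, reference_price, band_pct=0.05):
    """Allow a trade only inside +/- band_pct of the rolling reference price."""
    lower = reference_price * (1.0 - band_pct)
    upper = reference_price * (1.0 + band_pct)
    return lower <= trade_price <= upper

allowed = within_band(10.40, 10.00)      # inside the band: trade proceeds
halted = not within_band(10.60, 10.00)   # outside the band: trade is prevented
```

In the real mechanism the reference price is a rolling average and a breach triggers a limit state or trading pause rather than a silent rejection, but the band test above captures the core idea.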

Some content on this page was disabled on May 30, 2018 as a result of a DMCA takedown notice from W.W. Norton. You can learn more about the DMCA here:

https://en.support.wordpress.com/copyright-and-the-dmca/

Regulating the Velocities of Dark Pools. Thought of the Day 72.0


On 22 September 2010 the SEC chair Mary Schapiro signaled that US authorities were considering the introduction of regulations targeted at HFT:

…High frequency trading firms have a tremendous capacity to affect the stability and integrity of the equity markets. Currently, however, high frequency trading firms are subject to very little in the way of obligations either to protect that stability by promoting reasonable price continuity in tough times, or to refrain from exacerbating price volatility.

However, regulating an industry working towards moving as fast as the speed of light is no ordinary administrative task: “Modern finance is undergoing a fundamental transformation. Artificial intelligence, mathematical models, and supercomputers have replaced human intelligence, human deliberation, and human execution…. Modern finance is becoming cyborg finance – an industry that is faster, larger, more complex, more global, more interconnected, and less human.” C W Lin proposes a number of principles for regulating this cyborg finance industry:

  1. Update the antiquated paradigms of reasonable investors and compartmentalised institutions, confront the emerging institutional realities, and recognise that the old paradigms of market governance may be ill-suited to the new finance industry;
  2. Enhance disclosure in ways that recognise the complexity and technological capacities of the new finance industry;
  3. Adopt regulations to moderate the velocities of finance realising that as these approach the speed of light they may contain more risks than rewards for the new financial industry;
  4. Introduce smarter coordination harmonising financial regulation beyond traditional spaces of jurisdiction.

Electronic markets will require international coordination, surveillance and regulation. The high-frequency trading environment has the potential to generate errors and losses at a speed and magnitude far greater than in a floor or screen-based trading environment… Moreover, issues related to risk management of these technology-dependent trading systems are numerous and complex and cannot be addressed in isolation within domestic financial markets. For example, placing limits on high-frequency algorithmic trading or restricting unfiltered sponsored access and co-location within one jurisdiction might only drive trading firms to another jurisdiction where controls are less stringent.

In these regulatory endeavours it will be vital to remember that not all innovation is intrinsically good – some of it may be inherently dangerous – and that the objective is a more efficient and equitable financial system, not simply a faster one: despite its fast computers and credit derivatives, the current financial system does not seem better at transferring funds from savers to borrowers than the financial system of 1910. Furthermore, as Thomas Piketty’s Capital in the Twenty-First Century amply demonstrates, any thought of a democratisation of finance induced by the huge expansion of superannuation funds, together with the increased access to finance afforded by credit cards and ATMs, is something of a fantasy, since levels of structural inequality have endured through these technological transformations. The tragedy is that, under the guise of technological advance and sophistication, we could be destroying the capacity of financial markets to fulfil their essential purpose, as Haldane eloquently states:

An efficient capital market transfers savings today into investment tomorrow and growth the day after. In that way, it boosts welfare. Short-termism in capital markets could interrupt this transfer. If promised returns the day after tomorrow fail to induce saving today, there will be no investment tomorrow. If so, long-term growth and welfare would be the casualty.

Momentum of Accelerated Capital. Note Quote.


Distinct types of high frequency trading firms include, first, independent proprietary firms, which use private funds and specific strategies that remain secretive, and which may act as market makers generating automatic buy and sell orders continuously throughout the day. Second, broker-dealer proprietary desks are part of traditional broker-dealer firms but are unrelated to their client business, and are operated by the largest investment banks. Third, hedge funds focus on complex statistical arbitrage, taking advantage of pricing inefficiencies between asset classes and securities.

Today strategies using algorithmic trading and High Frequency Trading play a central role on financial exchanges, alternative markets, and banks’ internalized (over-the-counter) dealings:

High frequency traders typically act in a proprietary capacity, making use of a number of strategies and generating a very large number of trades every single day. They leverage technology and algorithms from end-to-end of the investment chain – from market data analysis and the operation of a specific trading strategy to the generation, routing, and execution of orders and trades. What differentiates HFT from algorithmic trading is the high frequency turnover of positions as well as its implicit reliance on ultra-low latency connection and speed of the system.

The use of algorithms in computerised exchange trading has experienced a long evolution with the increasing digitalisation of exchanges:

Over time, algorithms have continuously evolved: while initial first-generation algorithms – fairly simple in their goals and logic – were pure trade execution algos, second-generation algorithms – strategy implementation algos – have become much more sophisticated and are typically used to produce own trading signals which are then executed by trade execution algos. Third-generation algorithms include intelligent logic that learns from market activity and adjusts the trading strategy of the order based on what the algorithm perceives is happening in the market. HFT is not a strategy per se, but rather a technologically more advanced method of implementing particular trading strategies. The objective of HFT strategies is to seek to benefit from market liquidity imbalances or other short-term pricing inefficiencies.

While algorithms are employed by most traders in contemporary markets, the intense focus on speed and the momentary holding periods are unique to high frequency traders. The defence of high frequency trading is built around the claims that it increases liquidity, narrows spreads, and improves market efficiency: the high number of trades made by HFT traders results in greater liquidity in the market; algorithmic trading has meant that the prices of securities are updated more quickly, with more competitive bid-ask prices and narrowing spreads; and HFT enables prices to reflect information more quickly and accurately, ensuring accurate pricing at smaller time intervals. But there are critical differences between high frequency traders and traditional market makers:

  1. HFT firms have no affirmative market-making obligation; that is, they are not obliged to provide liquidity by constantly displaying two-sided quotes, which may translate into a lack of liquidity during volatile conditions.
  2. HFT contribute little market depth due to the marginal size of their quotes, which may result in larger orders having to transact with many small orders, and this may impact on overall transaction costs.
  3. HFT quotes are barely accessible, due to the extremely short duration for which their liquidity is available: orders are cancelled within milliseconds.

Beyond the shallowness of the HFT contribution to liquidity, there are real fears about how HFT can compound and magnify risk through the rapidity of its actions:

There is evidence that high-frequency algorithmic trading also has some positive benefits for investors by narrowing spreads – the difference between the price at which a buyer is willing to purchase a financial instrument and the price at which a seller is willing to sell it – and by increasing liquidity at each decimal point. However, a major issue for regulators and policymakers is the extent to which high-frequency trading, unfiltered sponsored access, and co-location amplify risks, including systemic risk, by increasing the speed at which trading errors or fraudulent trades can occur.

Although there have always been occasional trading errors and episodic volatility spikes in markets, the speed, automation and interconnectedness of today’s markets create a different scale of risk. These risks demand that exchanges and market participants employ effective quality management systems and sophisticated risk mitigation controls adapted to these new dynamics, in order to protect against potential threats to market stability arising from technology malfunctions or episodic illiquidity. There are, however, more deliberate aspects of HFT strategies which may present serious problems for market structure and functioning, and where conduct may be illegal. For example, order anticipation seeks to ascertain the existence of large buyers or sellers in the marketplace and then to trade ahead of them, in anticipation that their large orders will move market prices. A momentum strategy involves initiating a series of orders and trades in an attempt to ignite a rapid price move. HFT strategies can resemble traditional forms of market manipulation that violate the Exchange Act:

  1. Spoofing and layering occur when traders create a false appearance of market activity by entering multiple non-bona fide orders on one side of the market, at increasing or decreasing prices, in order to induce others to buy or sell the stock at a price altered by the bogus orders.
  2. Painting the tape involves placing successive small buy orders at increasing prices in order to stimulate increased demand.
  3. Quote stuffing and price fade are additional dubious HFT practices: quote stuffing floods the market with huge numbers of orders and cancellations in rapid succession, which may generate buying or selling interest or compromise the trading position of other market participants; order or price fade involves the rapid cancellation of orders in response to other trades.
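The common signature of spoofing, layering and quote stuffing is a flood of orders cancelled, unexecuted, within milliseconds of placement. That suggests a crude surveillance heuristic, sketched below. This is an illustration only: the function name, event schema and thresholds are assumptions for exposition, not any regulator's actual detection method.

```python
# Hedged sketch of a naive layering/quote-stuffing flag: count orders
# on each side of the book that are cancelled, without filling, within
# a short lifetime. Thresholds are illustrative, not regulatory values.

def flag_layering(events, max_lifetime_ms=50, min_cancels=3):
    """events: list of dicts with keys 'id', 'side' ('buy'/'sell'),
    'type' ('new'/'cancel'/'fill'), and 'ts' (milliseconds).
    Returns the set of sides showing a burst of fast cancellations."""
    placed = {}                            # order id -> (side, placement ts)
    fast_cancels = {'buy': 0, 'sell': 0}
    for e in events:
        if e['type'] == 'new':
            placed[e['id']] = (e['side'], e['ts'])
        elif e['type'] == 'cancel' and e['id'] in placed:
            side, t0 = placed.pop(e['id'])
            if e['ts'] - t0 <= max_lifetime_ms:
                fast_cancels[side] += 1    # cancelled almost immediately
        elif e['type'] == 'fill':
            placed.pop(e['id'], None)      # executed orders are not suspicious
    return {s for s, n in fast_cancels.items() if n >= min_cancels}

# Four buy orders, each cancelled 20 ms after placement, never filled:
events = [
    {'id': i, 'side': 'buy', 'type': 'new', 'ts': i * 10} for i in range(4)
] + [
    {'id': i, 'side': 'buy', 'type': 'cancel', 'ts': i * 10 + 20} for i in range(4)
]
print(flag_layering(events))  # {'buy'}
```

A real surveillance system would also weigh order sizes, price levels relative to the touch, and correlation with the flagged trader's executions on the opposite side; the sketch captures only the cancellation-speed dimension that the text emphasises.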

The World Federation of Exchanges insists: “Exchanges are committed to protecting market stability and promoting orderly markets, and understand that a robust and resilient risk control framework adapted to today’s high speed markets is a cornerstone of enhancing investor confidence.” However, this robust and resilient risk control framework seems lacking, including in the dark pools now established for trading, which were initially proposed as safer than the open market.

Production of the Schizoid, End of Capitalism and Laruelle’s Radical Immanence. Note Quote Didactics.


These are eclectics of the production, eclectics of the repetition, eclectics of the difference, where the fecundity of the novelty would either spring forth, or be weeded out. There is ‘schizoproduction’ prevalent in the world. This axiomatic schizoproduction is not a speech act, but discursive, in the sense that it constrains how meaning is distilled from relations, without the need for signifying, linguistic acts. Schizoproduction performs the relation. The bare minimum of schizoproduction is the gesture of transcending thought: namely, what François Laruelle calls a ‘decision’. Decision is differential, but it does not have to signify. It is the capacity to produce distinction and separation, in the most minimal, axiomatic form. Schizoproduction is capitalism turned into immanent capitalism, through a gesture of thought – sufficient thought. It is where capitalism has become a philosophy of life, in that it has a firm belief in sufficient thought, whatever it comes into contact with. It is an expression of the real, the radical immanence as a transcending arrangement. It is a collective articulation bound up with intricate relations and the management of carnal, affective, and discursive matter. The present form of capitalism is based on relationships, collaborations, and processuality, and in this is altogether different from the industrial period of modernism in the sense of subjectivity, production, governance, biopolitics and so on. In both cases, the life of a subject is valuable, since it is a substratum of potentiality and capacity, creativity and innovation; and in both cases, a subject is produced with physical, mental, cognitive and affective capacities compatible with each arrangement. Artistic practice is aligned with a shift from modern liberalism to the neoliberal dynamic position of the free agent.

Such attributes have thus become so obvious that the concepts of ‘competence’, ‘trust’ or ‘interest’ are taken as given facts, instead of being perceived as functions within an arrangement. It is not that neoliberal management has thrown the world out of joint, but rather that capitalism as philosophy has produced this world, with neoliberalism just one part of that philosophy. Therefore, the thought of the end of capitalism will always be speculative, since we may regard the world without capitalism in the same way as we may regard the world-not-for-humans, which may also be a speculative one. From its inception, capitalism paved a one-way path to annihilation, predicated as it was on unmitigated growth, the extraction of finite resources, the exaltation of individualism over communal ties, and the maximization of profit at the expense of the environment and society. The capitalist world was, as Thurston Clarke described so bleakly, “dominated by the concerns of trade and Realpolitik rather than by human rights and spreading democracy”; it was a “civilization influenced by the impersonal, bottom-line values of the corporations.” Capitalist industrial civilization was built on burning the organic remains of ancient organisms, but at the cost of destroying the stable climatic conditions which supported its very construction. The thirst for fossil fuels of our globalized, high-energy economy spurred increased technological development to extract the more difficult-to-reach reserves, but this frantic grasp for what was left only served to hasten the malignant transformation of Earth into an alien world.
The ruling class tried to hold things together for as long as they could by printing money, propping up markets, militarizing domestic law enforcement, and orchestrating thinly veiled resource wars in the name of fighting terrorism, but the crisis of capitalism was intertwined with the ecological crisis and could never be solved by those whose jobs and social standing depended on protecting the status quo. All the corporate PR, greenwashing, political promises, cultural myths, and anthropocentrism could not hide the harsh Malthusian reality of ecological overshoot. As crime skyrocketed and social unrest boiled over into rioting and looting, the elite retreated behind walled fortresses secured by armed guards, but the great unwinding of industrial civilization was already well underway. This evil genie was never going back in the bottle. And that, too, is speculative – or perhaps not; that nuance is one to be fought hard over.

The immanence of capitalism is a transcending immanence: a system which produces a world as an arrangement, through a capitalist form of thought—the philosophy of capitalism—which is a philosophy of sufficient reason in which economy is the determination in the last instance, and not the real. We need specifically to recognise that this world is not real. The world is a process, a “geopolitical fiction”. Aside from this reason, there is an unthinkable world that is not for humans. It is not the world in itself, noumena, nor is it nature, bios; rather, it is the world indifferent to and foreclosed from human thought, a foreclosed and radical immanence – the real – which is neither open nor ever opening itself to human thought. It will forever remain void and unilaterally indifferent. The radical immanence of the real is not an exception – analogous to the miracle in theology – but rather an advent of the unprecedented unknown, where the lonely hour of the last instance never comes. This radical immanence does not confer with ‘the new’ or with ‘the same’ and does not transcend through thought. It is matter in absolute movement, into which philosophy or oikonomia incorporates conditions, concepts, and operations. Now a shift in thought is possible, in which the determination in the last instance would no longer be economy but rather the radical immanence of the real, as the philosopher François Laruelle has argued. What is given, what is radically immanent in and as philosophy, is the mode of transcendental knowledge in which it operates. To know this mode of knowledge, to know it without entering into its circle, is to practise a science of the transcendental, the “transcendental science” of non-philosophy. This science is of the transcendental but, according to Laruelle, it must also itself be transcendental – it must be a global theory of the given-ness of the real.
A non-philosophical transcendental is required if philosophy as a whole, including its transcendental structure, is to be received and known as it is. François Laruelle radicalises the Marxist notion of determination-in-the-last-instance, reworked by Louis Althusser, for whom the last instance as a dominating force was the economy. For Laruelle, the determination-in-the-last-instance is the Real: “everything philosophy claims to master is in-the-last-instance thinkable from the One-Real”. For Althusser, referring to Engels, the economy is the ‘determination in the last instance’ in the long run, but only in relation to the other determinations by superstructures such as traditions. Following this, the “lonely hour of the ‘last instance’ never comes”.