Glue Code + Pipeline Jungles. Thought of the Day 25.0


Machine learning researchers tend to develop general-purpose solutions as self-contained packages. A wide variety of these are available as open-source packages, as in-house code, as proprietary packages, and on cloud-based platforms. Using self-contained solutions often results in a glue code system design pattern, in which a massive amount of supporting code is written to get data into and out of general-purpose packages.

This glue code design pattern can be costly in the long term, as it tends to freeze a system to the peculiarities of a specific package. General purpose solutions often have different design goals: they seek to provide one learning system to solve many problems, but many practical software systems are highly engineered to apply to one large-scale problem, for which many experimental solutions are sought. While generic systems might make it possible to interchange optimization algorithms, it is quite often refactoring of the construction of the problem space which yields the most benefit to mature systems. The glue code pattern implicitly embeds this construction in supporting code instead of in principally designed components. As a result, the glue code pattern often makes experimentation with other machine learning approaches prohibitively expensive, resulting in an ongoing tax on innovation.

Glue code can be reduced by choosing to re-implement specific algorithms within the broader system architecture. At first, this may seem like a high cost to pay – reimplementing a machine learning package in C++ or Java that is already available in R or MATLAB, for example, may appear to be a waste of effort. But the resulting system may require dramatically less glue code to integrate in the overall system, be easier to test, be easier to maintain, and be better designed to allow alternate approaches to be plugged in and empirically tested. Problem-specific machine learning code can also be tweaked with problem-specific knowledge that is hard to support in general packages.
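The pluggability argument can be sketched in code. Everything below is purely illustrative (none of these names come from the text): a small problem-specific contract keeps package-specific glue behind one boundary, so alternate learning approaches can be swapped in and compared on one shared harness.

```python
# Illustrative sketch only: a thin, problem-specific interface that hides
# package-specific glue behind one boundary, so alternate approaches can
# be plugged in and empirically tested. All names here are hypothetical.

class Ranker:
    """Problem-specific contract; everything behind it is replaceable."""
    def fit(self, features, labels):
        raise NotImplementedError
    def score(self, features):
        raise NotImplementedError

class MeanBaseline(Ranker):
    """Trivial in-house implementation: predicts the training-label mean."""
    def fit(self, features, labels):
        self.mean = sum(labels) / len(labels)
        return self
    def score(self, features):
        return [self.mean for _ in features]

def evaluate(ranker, features, labels):
    """Shared harness: any Ranker is compared on the same data and metric."""
    preds = ranker.score(features)
    return sum(abs(p - y) for p, y in zip(preds, labels)) / len(labels)

model = MeanBaseline().fit([[1], [2], [3]], [1.0, 2.0, 3.0])
mae = evaluate(model, [[1], [2], [3]], [1.0, 2.0, 3.0])
```

Swapping in a wrapper around an external package would only require implementing the same two methods, which is the point: the construction of the problem lives in a principled interface rather than in scattered supporting code.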

As a special case of glue code, pipeline jungles often appear in data preparation. These can evolve organically, as new signals are identified and new information sources added. Without care, the resulting system for preparing data in an ML-friendly format may become a jungle of scrapes, joins, and sampling steps, often with intermediate files output. Managing these pipelines, detecting errors and recovering from failures are all difficult and costly. Testing such pipelines often requires expensive end-to-end integration tests. All of this adds to technical debt of a system and makes further innovation more costly. It’s worth noting that glue code and pipeline jungles are symptomatic of integration issues that may have a root cause in overly separated “research” and “engineering” roles. When machine learning packages are developed in an ivory-tower setting, the resulting packages may appear to be more like black boxes to the teams that actually employ them in practice.

The Silicon Ideology


Traditional anti-fascist tactics have largely been formulated in response to 20th century fascism. I am not confident that they will be sufficient to defeat neo-reactionaries. That is not to say they will not be useful; merely insufficient. Neo-reactionaries must be fought on their own ground (the internet), and with their own tactics: doxxing especially, which has been shown to be effective at threatening the alt-right. Information must be spread about neo-reactionaries, such that they lose opportunities to accumulate capital and social capital…

…Transhumanism, for many, seems to be the part of neo-reactionary ideology that “sticks out” from the rest. Indeed, some wonder how neo-reactionaries and transhumanists would ever mix, and why I am discussing LessWrong in the context of neo-reactionary beliefs. For the last question, this is because LessWrong served as a convenient “incubation centre”, so to speak, for neo-reactionary ideas to develop and spread for many years, and the goal of LessWrong, a friendly super-intelligent AI ruling humanity for its own good, was fundamentally compatible with existing neo-reactionary ideology, which had already begun developing a futurist orientation in its infancy due, in part, to its historical and cultural influences. The rest of the question, however, is not just historical, but theoretical: what is transhumanism and why does it mix well with reactionary ideology?…

…In the words of Moldbug:

A startup is basically structured as a monarchy. We don’t call it that, of course. That would seem weirdly outdated, and anything that’s not democracy makes people uncomfortable. We are biased toward the democratic-republican side of the spectrum. That’s what we’re used to from civics classes. But, the truth is that startups and founders lean toward the dictatorial side because that structure works better for startups.

He doesn’t, of course, claim that this would be a good way to rule a country, but that is the clear message sent by his political projects. Balaji Srinivasan made a similar rhetorical move, using clear neo-reactionary ideas without mentioning their sources, in a speech to a “startup school” affiliated with Y Combinator:

We want to show what a society run by Silicon Valley would look like. That’s where “exit” comes in . . . . It basically means: build an opt-in society, ultimately outside the US, run by technology. And this is actually where the Valley is going. This is where we’re going over the next ten years . . . [Google co-founder] Larry Page, for example, wants to set aside a part of the world for unregulated experimentation. That’s carefully phrased. He’s not saying, “take away the laws in the U.S.” If you like your country, you can keep it. Same with Marc Andreessen: “The world is going to see an explosion of countries in the years ahead—doubled, tripled, quadrupled countries.”

Well, that’s “The Silicon Ideology” through.


Nick Land – The Dark Enlightenment: Neoreaction & Modernity

Nick joins a fascinating discussion on Neoreaction, progressivism, and the future of Western civilization. Initially a libertarian, Nick eventually stumbled upon the work of Mencius Moldbug. This leads to a consideration of the central ideas of Neoreaction, and then to a discussion of neocameralism, an approach to governance that incorporates free market forces. Nick then describes the Cathedral, which refers to the alliance of institutions – NGOs, corporations, academia, etc. – responsible for promoting progressivism. Later, Nick draws an apt comparison between the invention of the printing press and the rise of the Internet. The first hour touches on much more, including universalism, modernity, and multiculturalism.

In the members’ hour (which sadly is a paid service!), gears are switched to consider potential solutions to the West’s decline. Nick argues that we must dismantle the managerial state, explaining that the Right shouldn’t count on being able to control it indefinitely. There is then a discussion on Islam and European politics. Nick explains how Europe’s ongoing demographic transformation is poised to drastically alter the European political landscape. Later, we talk about the need for an ideological divorce, as Western nations have become far too divided to function properly. Nick argues that such a divorce would allow for competition between different political factions, revealing which lead to success and which do not. The show concludes with a consideration of whether or not the Left will realize its mistakes before it’s too late.

High Frequency Traders: A Case in Point.

Events on 6th May 2010:

At 2:32 p.m., against [a] backdrop of unusually high volatility and thinning liquidity, a large fundamental trader (a mutual fund complex) initiated a sell program to sell a total of 75,000 E-Mini [S&P 500 futures] contracts (valued at approximately $4.1 billion) as a hedge to an existing equity position. […] This large fundamental trader chose to execute this sell program via an automated execution algorithm (“Sell Algorithm”) that was programmed to feed orders into the June 2010 E-Mini market to target an execution rate set to 9% of the trading volume calculated over the previous minute, but without regard to price or time. The execution of this sell program resulted in the largest net change in daily position of any trader in the E-Mini since the beginning of the year (from January 1, 2010 through May 6, 2010). [. . . ] This sell pressure was initially absorbed by: high frequency traders (“HFTs”) and other intermediaries in the futures market; fundamental buyers in the futures market; and cross-market arbitrageurs who transferred this sell pressure to the equities markets by opportunistically buying E-Mini contracts and simultaneously selling products like SPY [(S&P 500 exchange-traded fund (“ETF”))], or selling individual equities in the S&P 500 Index. […] Between 2:32 p.m. and 2:45 p.m., as prices of the E-Mini rapidly declined, the Sell Algorithm sold about 35,000 E-Mini contracts (valued at approximately $1.9 billion) of the 75,000 intended. [. . . ] By 2:45:28 there were less than 1,050 contracts of buy-side resting orders in the E-Mini, representing less than 1% of buy-side market depth observed at the beginning of the day. [. . . ] At 2:45:28 p.m., trading on the E-Mini was paused for five seconds when the Chicago Mercantile Exchange (“CME”) Stop Logic Functionality was triggered in order to prevent a cascade of further price declines. 
[…] When trading resumed at 2:45:33 p.m., prices stabilized and shortly thereafter, the E-Mini began to recover, followed by the SPY. [. . . ] Even though after 2:45 p.m. prices in the E-Mini and SPY were recovering from their severe declines, sell orders placed for some individual securities and Exchange Traded Funds (ETFs) (including many retail stop-loss orders, triggered by declines in prices of those securities) found reduced buying interest, which led to further price declines in those securities. […] [B]etween 2:40 p.m. and 3:00 p.m., over 20,000 trades (many based on retail-customer orders) across more than 300 separate securities, including many ETFs, were executed at prices 60% or more away from their 2:40 p.m. prices. [. . . ] By 3:08 p.m., [. . . ] the E-Mini prices [were] back to nearly their pre-drop level [. . . and] most securities had reverted back to trading at prices reflecting true consensus values.

In the ordinary course of business, HFTs use their technological advantage to profit from aggressively removing the last few contracts at the best bid and ask levels and then establishing new best bids and asks at adjacent price levels ahead of an immediacy-demanding customer. As an illustration of this “immediacy absorption” activity, consider the following stylized example, presented in the figure and described below.

[Figure: stylized central limit order book illustrating the immediacy absorption example]

Suppose that we observe the central limit order book for a stock index futures contract. The notional value of one stock index futures contract is $50. The market is very liquid – on average there are hundreds of resting limit orders to buy or sell multiple contracts at either the best bid or the best offer. At some point during the day, due to temporary selling pressure, there is a total of just 100 contracts left at the best bid price of 1000.00. Recognizing that the queue at the best bid is about to be depleted, HFTs submit executable limit orders to aggressively sell a total of 100 contracts, thus completely depleting the queue at the best bid, and very quickly submit sequences of new limit orders to buy a total of 100 contracts at the new best bid price of 999.75, as well as to sell 100 contracts at the new best offer of 1000.00. If the selling pressure continues, then HFTs are able to buy 100 contracts at 999.75 and make a profit of $1,250 among them. If, however, the selling pressure stops and the new best offer price of 1000.00 attracts buyers, then HFTs would very quickly sell 100 contracts (which are at the very front of the new best offer queue), “scratching” the trade at the same price as they bought, and getting rid of the risky inventory in a few milliseconds.
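The arithmetic of the stylized example can be checked directly. Each index point is worth $50 per contract, so one 0.25-point tick is $12.50 per contract:

```python
POINT_VALUE = 50.0   # dollars per index point per contract
TICK = 0.25          # minimum price increment, in index points

contracts = 100
sell_price = 1000.00  # HFTs aggressively sell the last 100 at the best bid
rebuy_price = 999.75  # then rest new bids one tick lower

# If selling pressure continues, HFTs buy back one tick cheaper:
profit = (sell_price - rebuy_price) * POINT_VALUE * contracts  # 1250.0

# If pressure stops instead, the trade is "scratched" at the same price:
scratch = (sell_price - sell_price) * POINT_VALUE * contracts  # 0.0
```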

This type of trading activity reduces, albeit for only a few milliseconds, the latency of a price move. Under normal market conditions, this trading activity somewhat accelerates price changes and adds to the trading volume, but does not result in a significant directional price move. In effect, this activity imparts a small “immediacy absorption” cost on all traders, including the market makers, who are not fast enough to cancel the last remaining orders before an imminent price move.

This activity, however, makes it both costlier and riskier for the slower market makers to maintain continuous market presence. In response to the additional cost and risk, market makers lower their acceptable inventory bounds to levels that are too small to offset temporary liquidity imbalances of any significant size. When the diminished liquidity buffer of the market makers is pierced by a sudden order flow imbalance, they begin to demand a progressively greater compensation for maintaining continuous market presence, and prices start to move directionally. Just as the prices are moving directionally and volatility is elevated, immediacy absorption activity of HFTs can exacerbate a directional price move and amplify volatility. Higher volatility further increases the speed at which the best bid and offer queues are being depleted, inducing HFT algorithms to demand immediacy even more, fueling a spike in trading volume, and making it more costly for the market makers to maintain continuous market presence. This forces more risk averse market makers to withdraw from the market, which results in a full-blown market crash.

Empirically, immediacy absorption activity of the HFTs should manifest itself in the data very differently from the liquidity provision activity of the Market Makers. To establish the presence of these differences in the data, we test the following hypotheses:

Hypothesis H1: HFTs are more likely than Market Makers to aggressively execute the last 100 contracts before a price move in the direction of the trade. Market Makers are more likely than HFTs to have the last 100 resting contracts against which aggressive orders are executed.

Hypothesis H2: HFTs trade aggressively in the direction of the price move. Market Makers get run over by a price move.

Hypothesis H3: Both HFTs and Market Makers scratch trades, but HFTs scratch more.

To statistically test our “immediacy absorption” hypotheses against the “liquidity provision” hypotheses, we divide all of the trades during the 405-minute trading day into two subsets: Aggressive Buy trades and Aggressive Sell trades. Within each subset, we further aggregate multiple aggressive buy or sell transactions resulting from the execution of the same order into Aggressive Buy or Aggressive Sell sequences. The intuition is as follows. Often a specific trade is not a stand-alone event, but a part of a sequence of transactions associated with the execution of the same order. For example, an order to aggressively sell 10 contracts may result in four Aggressive Sell transactions: for 2 contracts, 1 contract, 4 contracts, and 3 contracts, respectively, due to the specific sequence of resting bids against which this aggressive sell order was executed. Using the order ID number, we are able to aggregate these four transactions into one Aggressive Sell sequence for 10 contracts.
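That aggregation step can be sketched in a few lines. The field names below (order_id, side, qty) are illustrative assumptions, not the study's actual schema: transactions sharing an order ID are collapsed into one Aggressive sequence.

```python
# Collapse same-order transactions into one Aggressive Buy/Sell sequence.
from collections import defaultdict

# The four transactions from the 10-contract example in the text.
transactions = [
    {"order_id": "A7", "side": "sell", "qty": 2},
    {"order_id": "A7", "side": "sell", "qty": 1},
    {"order_id": "A7", "side": "sell", "qty": 4},
    {"order_id": "A7", "side": "sell", "qty": 3},
]

def aggregate_sequences(txns):
    """Sum quantities per (order ID, direction) to form sequences."""
    totals = defaultdict(int)
    for t in txns:
        totals[(t["order_id"], t["side"])] += t["qty"]
    return dict(totals)

sequences = aggregate_sequences(transactions)  # {('A7', 'sell'): 10}
```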

Testing Hypothesis H1. Aggressive removal of the last 100 contracts by HFTs; passive provision of the last 100 resting contracts by the Market Makers. Using the Aggressive Buy sequences, we label as a “price increase event” all occurrences of trading sequences in which at least 100 contracts consecutively executed at the same price are followed by some number of contracts at a higher price. To examine indications of low latency, we focus on the last 100 contracts traded before the price increase and the first 100 contracts at the next higher price (or fewer if the price changes again before 100 contracts are executed). Although we do not look directly at the limit order book data, price increase events are defined to capture occasions where traders use executable buy orders to lift the last remaining offers in the limit order book. Using Aggressive sell trades, we define “price decrease events” symmetrically as occurrences of sequences of trades in which 100 contracts executed at the same price are followed by executions at lower prices. These events are intended to capture occasions where traders use executable sell orders to hit the last few best bids in the limit order book. The results are presented in the table below.
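Under this definition, a price increase event can be detected from a chronological trade stream alone. The sketch below assumes aggressive buy executions arrive as (price, quantity) pairs; it flags the first trade at the new, higher price whenever at least 100 contracts executed consecutively at the old price.

```python
def price_increase_events(trades, threshold=100):
    """trades: chronological list of (price, qty) aggressive-buy executions.
    Returns indices of the first trade at each new, higher price that
    follows a run of at least `threshold` contracts at one price."""
    events = []
    run_price, run_qty = None, 0
    for i, (price, qty) in enumerate(trades):
        if price == run_price:
            run_qty += qty           # extend the run at the current price
        else:
            if run_price is not None and price > run_price and run_qty >= threshold:
                events.append(i)     # price moved up after a >=100-contract run
            run_price, run_qty = price, qty
    return events

# 110 contracts trade at 1000.00, then the price ticks up to 1000.25.
trades = [(1000.00, 60), (1000.00, 50), (1000.25, 30), (1000.25, 10)]
found = price_increase_events(trades)  # [2]
```

Price decrease events would be defined symmetrically, with `price < run_price`.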

[Table: Aggressive and Passive volume shares by trader category around price increase and price decrease events]

For price increase and price decrease events, we calculate each of the six trader categories’ shares of Aggressive and Passive trading volume for the last 100 contracts traded at the “old” price level before the price increase or decrease and the first 100 contracts traded at the “new” price level (or fewer if the number of contracts is less than 100) after the price increase or decrease event.

Table above presents, for the six trader categories, volume shares for the last 100 contracts at the old price and the first 100 contracts at the new price. For comparison, the unconditional shares of aggressive and passive trading volume of each trader category are also reported. Table has four panels covering (A) price increase events on May 3-5, (B) price decrease events on May 3-5, (C) price increase events on May 6, and (D) price decrease events on May 6. In each panel there are six rows of data, one row for each trader category. Relative to panels A and C, the rows for Fundamental Buyers (BUYER) and Fundamental Sellers (SELLER) are reversed in panels B and D to emphasize the symmetry between buying during price increase events and selling during price decrease events. The first two columns report the shares of Aggressive and Passive contract volume for the last 100 contracts before the price change; the next two columns report the shares of Aggressive and Passive volume for up to the next 100 contracts after the price change; and the last two columns report the “unconditional” market shares of Aggressive and Passive sides of all Aggressive buy volume or sell volume. For May 3-5, the data are based on volume pooled across the three days.

Consider panel A, which describes price increase events associated with Aggressive buy trades on May 3-5, 2010. High Frequency Traders participated on the Aggressive side of 34.04% of all aggressive buy volume. Strongly consistent with the immediacy absorption hypothesis, the participation rate rises to 57.70% of the Aggressive side of trades on the last 100 contracts of Aggressive buy volume before price increase events and falls to 14.84% of the Aggressive side of trades on the first 100 contracts of Aggressive buy volume after price increase events.

High Frequency Traders participated on the Passive side of 34.33% of all aggressive buy volume. Consistent with the hypothesis, the participation rate on the Passive side of Aggressive buy volume falls to 28.72% of the last 100 contracts before a price increase event. It rises to 37.93% of the first 100 contracts after a price increase event.

These results are inconsistent with the idea that high frequency traders behave like textbook market makers, suffering adverse selection losses associated with being picked off by informed traders. Instead, when the price is about to move to a new level, high frequency traders tend to avoid being run over and take the price to the new level with Aggressive trades of their own.

Market Makers follow a noticeably more passive trading strategy than High Frequency Traders. According to panel A, Market Makers are 13.48% of the Passive side of all Aggressive trades, but they are only 7.27% of the Aggressive side of all Aggressive trades. On the last 100 contracts at the old price, Market Makers’ share of volume increases only modestly, from 7.27% to 8.78% of trades. Their share of Passive volume at the old price increases, from 13.48% to 15.80%. These facts are consistent with the interpretation that Market Makers, unlike High Frequency Traders, do engage in a strategy similar to traditional passive market making, buying at the bid price, selling at the offer price, and suffering losses when the price moves against them. These facts are also consistent with the hypothesis that High Frequency Traders have lower latency than Market Makers.

Intuition might suggest that Fundamental Buyers would tend to place the Aggressive trades which move prices up from one tick level to the next. This intuition does not seem to be corroborated by the data. According to panel A, Fundamental Buyers are 21.53% of all Aggressive trades but only 11.61% of the last 100 Aggressive contracts traded at the old price. Instead, Fundamental Buyers increase their share of Aggressive buy volume to 26.17% of the first 100 contracts at the new price.

Taking into account symmetry between buying and selling, panel B shows the results for Aggressive sell trades during May 3-5, 2010, are almost the same as the results for Aggressive buy trades. High Frequency Traders are 34.17% of all Aggressive sell volume, increase their share to 55.20% of the last 100 Aggressive sell contracts at the old price, and decrease their share to 15.04% of the first 100 Aggressive sell contracts at the new price. Market Makers are 7.45% of all Aggressive sell contracts, increase their share to only 8.57% of the last 100 Aggressive sell trades at the old price, and decrease their share to 6.58% of the first 100 Aggressive sell contracts at the new price. Fundamental Sellers’ shares of Aggressive sell trades behave similarly to Fundamental Buyers’ shares of Aggressive Buy trades. Fundamental Sellers are 20.91% of all Aggressive sell contracts, decrease their share to 11.96% of the last 100 Aggressive sell contracts at the old price, and increase their share to 24.87% of the first 100 Aggressive sell contracts at the new price.

Panels C and D report results for Aggressive Buy trades and Aggressive Sell trades for May 6, 2010. Taking into account symmetry between buying and selling, the results for Aggressive buy trades in panel C are very similar to the results for Aggressive sell trades in panel D. For example, Aggressive sell trades by Fundamental Sellers were 17.55% of Aggressive sell volume on May 6, while Aggressive buy trades by Fundamental Buyers were 20.12% of Aggressive buy volume on May 6. In comparison with the share of Fundamental Buyers and in comparison with May 3-5, the Flash Crash of May 6 is associated with a slightly lower – not higher – share of Aggressive sell trades by Fundamental Sellers.

The number of price increase and price decrease events increased dramatically on May 6, consistent with the increased volatility of the market on that day. On May 3-5, there were 4100 price increase events and 4062 price decrease events. On May 6 alone, there were 4101 price increase events and 4377 price decrease events. There were therefore approximately three times as many price increase events per day on May 6 as on the three preceding days.

A comparison of May 6 with May 3-5 reveals significant changes in the trading patterns of High Frequency Traders. Compared with May 3-5 in panels A and B, the share of Aggressive trades by High Frequency Traders drops from 34.04% of Aggressive buys and 34.17% of Aggressive sells on May 3-5 to 26.98% of Aggressive buy trades and 26.29% of Aggressive sell trades on May 6. The share of Aggressive trades for the last 100 contracts at the old price declines by even more. High Frequency Traders’ participation rate on the Aggressive side of Aggressive buy trades drops from 57.70% on May 3-5 to only 38.86% on May 6. Similarly, the participation rate on the Aggressive side of Aggressive sell trades drops from 55.20% to 38.67%. These declines are largely offset by increases in the participation rate by Opportunistic Traders on the Aggressive side of trades. For example, Opportunistic Traders’ share of the Aggressive side of the last 100 contracts traded at the old price rises from 19.21% to 34.26% for Aggressive buys and from 20.99% to 33.86% for Aggressive sells. These results suggest that some Opportunistic Traders follow trading strategies for which low latency is important, such as index arbitrage, cross-market arbitrage, or opportunistic strategies mimicking market making.

Testing Hypothesis H2. HFTs trade aggressively in the direction of the price move; Market Makers get run over by a price move. To examine this hypothesis, we analyze whether High Frequency Traders use Aggressive trades to trade in the direction of contemporaneous price changes, while Market Makers use Passive trades to trade in the opposite direction from price changes. To this end, we estimate the regression equation

Δy_t = α + φ·Δy_{t−1} + δ·y_{t−1} + Σ_{i=0}^{20} β_i·[Δp_{t−i}/0.25] + ε_t

(where y_t and Δy_t denote the inventories and the change in inventories of High Frequency Traders for each second of a trading day; t = 0 corresponds to the opening of stock trading on the NYSE at 8:30:00 a.m. CT (9:30:00 a.m. ET) and t = 24,300 denotes the close of Globex at 15:15:00 CT (4:15 p.m. ET); and Δp_t denotes the price change in index point units between the high-low midpoint of second t−1 and the high-low midpoint of second t. The regression thus relates second-by-second changes in the inventory levels of High Frequency Traders to the level of their inventories in the previous second, the change in their inventory levels in the previous second, the change in prices during the current second, and lagged price changes for each of the previous 20 seconds.)

for Passive and Aggressive inventory changes separately.
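On toy data, the regression above can be sketched as an ordinary least squares fit with a constant, the lagged change in inventories, the lagged inventory level, and 21 tick-scaled price-change terms (contemporaneous plus 20 lags). Everything below, including the simulated series, is illustrative rather than the study's actual estimation code:

```python
import numpy as np

rng = np.random.default_rng(0)
T, LAGS, TICK = 500, 20, 0.25
p = np.cumsum(rng.normal(scale=TICK, size=T))  # toy high-low midpoint prices
y = np.cumsum(rng.normal(size=T))              # toy HFT inventory path
dy, dp = np.diff(y), np.diff(p)                # per-second changes

rows, target = [], []
for t in range(LAGS + 1, len(dy)):
    rows.append(
        [1.0, dy[t - 1], y[t]]                         # alpha, phi, delta terms
        + [dp[t - i] / TICK for i in range(LAGS + 1)]  # beta_0 .. beta_20
    )
    target.append(dy[t])

X, z = np.array(rows), np.array(target)
coef, *_ = np.linalg.lstsq(X, z, rcond=None)
alpha, phi, delta, betas = coef[0], coef[1], coef[2], coef[3:]
```

Running the same fit four times, with net Aggressive and net Passive volume for HFTs and Market Makers as the dependent variable, yields the four columns of coefficients discussed below.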


Table above presents the regression results of the two components of change in holdings on lagged inventory, lagged change in holdings and lagged price changes over one second intervals. Panel A and Panel B report the results for May 3-5 and May 6, respectively. Each panel has four columns, reporting estimated coefficients where the dependent variables are net Aggressive volume (Aggressive buys minus Aggressive sells) by High Frequency Traders (∆AHFT), net Passive volume by High Frequency Traders (∆PHFT), net Aggressive volume by Market Makers (∆AMM), and net Passive volume by Market Makers (∆PMM).

We observe that for lagged inventories (NPHFT,t−1), the estimated coefficients for Aggressive and Passive trades by High Frequency Traders are δAHFT = −0.005 (t = −9.55) and δPHFT = −0.001 (t = −3.13), respectively. These coefficient estimates have the interpretation that High Frequency Traders use Aggressive trades to liquidate inventories more intensively than Passive trades. In contrast, the results for Market Makers are very different. For lagged inventories (NPMM,t−1), the estimated coefficients for Aggressive and Passive volume by Market Makers are δAMM = −0.002 (t = −6.73) and δPMM = −0.002 (t = −5.26), respectively. The similarity of these coefficient estimates has the interpretation that Market Makers favor neither Aggressive trades nor Passive trades when liquidating inventories.

For contemporaneous price changes (∆p_t, in the current second), the estimated coefficients for Aggressive and Passive volume by High Frequency Traders are β0 = 57.78 (t = 31.94) and β0 = −25.69 (t = −28.61), respectively. For Market Makers, the estimated coefficients for Aggressive and Passive trades are β0 = 6.38 (t = 18.51) and β0 = −19.92 (t = −37.68). These estimated coefficients have the interpretation that in seconds in which prices move up one tick, High Frequency Traders are net buyers of about 58 contracts with Aggressive trades and net sellers of about 26 contracts with Passive trades in that same second, while Market Makers are net buyers of about 6 contracts with Aggressive trades and net sellers of about 20 contracts with Passive trades. High Frequency Traders and Market Makers are similar in that they both use Aggressive trades to trade in the direction of price changes, and both use Passive trades to trade against the direction of price changes. High Frequency Traders and Market Makers are different in that Aggressive net purchases by High Frequency Traders are greater in magnitude than the Passive net purchases, while the reverse is true for Market Makers.

For lagged price changes, coefficient estimates for Aggressive trades by High Frequency Traders and Market Makers are positive and statistically significant at lags 1-4 and lags 1-10, respectively. These results have the interpretation that both High Frequency Traders and Market Makers trade on recent price momentum, but the trading is compressed into a shorter time frame for High Frequency Traders than for Market Makers.

For lagged price changes, coefficient estimates for Passive volume by High Frequency Traders and Market Makers are negative and statistically significant at lag 1 and lags 1-3, respectively. Panel B of the table presents results for May 6. Similar to May 3-5, High Frequency Traders tend to use Aggressive trades more intensely than Passive trades to liquidate inventories, while Market Makers do not show this pattern. Also similar to May 3-5, High Frequency Traders and Market Makers use Aggressive trades to trade in the contemporaneous direction of price changes and use Passive trades to trade in the direction opposite price changes, with Aggressive trading greater than Passive trading for High Frequency Traders and the reverse for Market Makers. In comparison with May 3-5, the coefficients are smaller in magnitude on May 6, indicating reduced liquidity at each tick. For lagged price changes, the coefficients associated with Aggressive trading by High Frequency Traders change from positive to negative at lags 1-4, and the coefficients associated with Aggressive trading by Market Makers change from being positive and statistically significant at lags 1-10 to being positive and statistically significant only at lags 1-3. These results illustrate accelerated trading velocity in the volatile market conditions of May 6.

We further examine how high frequency trading activity is related to market prices. The figure below illustrates how prices change after HFT trading activity in a given second. The upper-left panel presents results for buy trades on May 3-5, the upper-right panel presents results for buy trades on May 6, and the lower-left and lower-right panels present corresponding results for sell trades. For an “event” second in which High Frequency Traders are net buyers, net Aggressive Buyers, or net Passive Buyers, the value-weighted average price paid by the High Frequency Traders in that second is subtracted from the value-weighted average prices for all trades in the same second and each of the following 20 seconds. The results are averaged across event seconds, weighted by the magnitude of High Frequency Traders’ net position change in the event second. Price differences on the vertical axis are scaled so that one unit equals one tick ($12.50 per contract).
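The averaging procedure described above can be sketched as follows. The array names and exact weighting scheme are assumptions for illustration, not the study's code: for each event second, the market's value-weighted average prices over the following 20 seconds are differenced against the price HFTs paid, scaled to ticks, and averaged across events with weights given by the net position change.

```python
import numpy as np

TICK = 0.25  # one tick in index points; $12.50 per contract at $50 per point

def event_study(event_secs, hft_vwap, market_vwap, weights, horizon=20):
    """event_secs: seconds in which HFTs were (say) net buyers.
    hft_vwap[t]: value-weighted average price HFTs paid in second t.
    market_vwap[t]: value-weighted average price of all trades in second t.
    weights: magnitude of HFT net position change in each event second.
    Returns the weighted-average price path in ticks, seconds 0..horizon."""
    paths = [
        (market_vwap[t : t + horizon + 1] - hft_vwap[t]) / TICK
        for t in event_secs
    ]
    return np.average(np.array(paths), axis=0, weights=np.asarray(weights, float))

# Toy usage: two event seconds, market trading one tick above the HFT price.
path = event_study([0, 5], np.zeros(30), np.full(30, 0.25), [1.0, 3.0], horizon=2)
```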

[Figure: average price paths over the 20 seconds following seconds in which HFTs were net buyers or net sellers, May 3-5 and May 6]

When High Frequency Traders are net buyers on May 3-5, prices rise by 17% of a tick in the next second. When HFTs execute Aggressively or Passively, prices rise by 20% and 2% of a tick in the next second, respectively. In subsequent seconds, prices in all cases trend downward by about 5% of a tick over the subsequent 19 seconds. For May 3-5, the results are almost symmetric for selling.

When High Frequency Traders are buying on May 6, prices increase by 7% of a tick in the next second. When they are aggressive buyers or passive buyers, prices respectively increase by 25% of a tick or decrease by 5% of a tick in the next second. In subsequent seconds, prices generally tend to drift downwards. The downward drift is especially pronounced after Passive buying, consistent with the interpretation that High Frequency Traders’ resting limit buy orders were “run over” in the down phase of the Flash Crash. When High Frequency Traders are net sellers, the results after one second are analogous to buying. After aggressive selling, prices continue to drift down for 20 seconds, consistent with the interpretation that High Frequency Traders made profits from Aggressive sales during the down phase of the Flash Crash.

Testing Hypothesis H3. Both HFTs and Market Makers scratch trades; HFTs scratch more. A textbook market maker will try to buy at the bid price, sell at the offer price, and capture the bid-ask spread as a profit. Sometimes, after buying at the bid price, market prices begin to fall before the market maker can make a one-tick profit by selling his inventory at the best offer price. To avoid taking losses in this situation, one component of a traditional market making strategy is to “scratch” trades in the presence of changing market conditions by quickly liquidating a position at the same price at which it was acquired. These scratched trades represent inventory management trades designed to lower the cost of adverse selection. Since many competing market makers may try to scratch trades at the same time, traders with the lowest latency will tend to be more successful in their attempts to scratch trades, and thus more successful in their ability to avoid losses when market conditions change.

To examine whether and to what extent traders engage in trade scratching, we sequence each trader’s trades for the day using audit trail sequence numbers, which not only sort trades by second but also sort trades chronologically within each second. We define an “immediately scratched trade” as a trade with the properties that the next trade in the sorted sequence (1) occurred in the same second, (2) was executed at the same price, and (3) was in the opposite direction, i.e., a buy followed by a sell or a sell followed by a buy. For each of the trading accounts in our sample, we calculate the number of immediately scratched trades, then compare the number of scratched trades across the six trader categories.
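The three-part definition above reduces to a check on adjacent trades in the sorted sequence. A minimal sketch, assuming each trade is a (sequence, second, price, side) tuple already sorted by audit-trail sequence number (the representation and field order are illustrative, not from the study):

```python
# Count "immediately scratched" trades: a trade whose next trade in sequence
# (1) occurred in the same second, (2) at the same price, (3) on the opposite
# side. Trades are (sequence, second, price, side) tuples, pre-sorted.

def count_immediately_scratched(trades):
    scratched = 0
    for cur, nxt in zip(trades, trades[1:]):
        same_second = cur[1] == nxt[1]
        same_price = cur[2] == nxt[2]
        opposite_side = cur[3] != nxt[3]
        if same_second and same_price and opposite_side:
            scratched += 1
    return scratched
```

Running this per trading account and aggregating by trader category would yield the counts and percentages reported in the table below.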

The results of this analysis are presented in the table below. Panel A provides results for May 3-5 and Panel B for May 6. In each panel, there are five rows of data, one for each trader category. The first three columns report the total number of trades, the total number of immediately scratched trades, and the percentage of trades that are immediately scratched by traders in each category. For May 3-5, the reported numbers are from the pooled data.

[Table: Immediately scratched trades by trader category, May 3-5 (Panel A) and May 6 (Panel B)]

This table presents statistics for immediate trade scratching, which measures how many times a trader changes his/her direction of trading within a second, aggregated over a day. We define a trade direction change as a buy trade right after a sell trade, or vice versa, at the same price level in the same second.

This table shows that High Frequency Traders scratched 2.84% of trades on May 3-5 and 4.26% on May 6; Market Makers scratched 2.49% of trades on May 3-5 and 5.53% of trades on May 6. While the percentage of immediately scratched trades by Market Makers is slightly higher than that for High Frequency Traders on May 6, the percentages for both groups are very similar. The fourth, fifth, and sixth columns of the table report the mean, standard deviation, and median of the number of scratched trades for the traders in each category.

Although the percentages of scratched trades are similar, the mean number of immediately scratched trades by High Frequency Traders is much greater than for Market Makers: 540.56 per day on May 3-5 and 1610.75 on May 6 for High Frequency Traders versus 13.35 and 72.92 for Market Makers. The differences between High Frequency Traders and Market Makers reflect differences in volume traded. The Table shows that High Frequency Traders and Market Makers scratch a significantly larger percentage of their trades than other trader categories.

Phantom Originary Intentionality: Thought of the Day 16.0


Phantom limbs and anosognosias – cases of abnormal impressions of the presence or absence of parts of our body – seem like handy illustrations of an irreducible, first-person dimension of experience, of the sort that will delight the phenomenologist, who will say: aha! here is an empirical case of self-reference which externalist, third-person explanations of the type favoured by deflationary materialists cannot explain away, cannot do away with. As Merleau-Ponty would say, and Varela after him, there is something about my body which makes it irreducibly my own (le corps propre). Whether illusory or not, such images (phantoms) have something about them such that we perceive them as our own, not someone else’s (well, some anosognosias are different: thinking our paralyzed limb is precisely someone else’s, often a relative’s). One might then want to insist that phantom limbs testify to the transcendence of mental life! Indeed, in one of the more celebrated historical cases of phantom limb syndrome, Lord Horatio Nelson, having lost his right arm in a sea battle off Tenerife, suffered from pains in his phantom hand. Most importantly, he apparently declared that this phantom experience was a “direct proof of the existence of the soul”.

Although the materialist might agree with the (reformed) phenomenologist to reject dualism and accept that we are not in our bodies like a sailor in a ship, she might not want to go and declare, as Merleau-Ponty does, that “the mind does not use the body, but fulfills itself through it while at the same time transferring the body outside of physical space.” This way of talking goes back to the Husserlian distinction between Körper, ‘body’ in the sense of one body among others in a vast mechanistic universe of bodies, and Leib, ‘flesh’ in the sense of a subjectivity which is the locus of experience.
Now, granted, in cognitivist terms one would want to say that a representation is always my representation, it is not ‘transferable’ like a neutral piece of information, since the way an object appears to me is always a function of my needs and interests. What my senses tell me at any given time relies on my interests as an agent and is determined by them, as described by Andy Clark, who appeals to the combined research traditions of the psychology of perception, new robotics, and Artificial Life. But the phenomenologist will take off from there and build a full-blown defense of intentionality, now recast as ‘motor intentionality’, a notion which goes back to Husserl’s claim in Ideas II that the way the body relates to the external world is crucially through “kinestheses”: all external motions which we perceive are first of all related to kinesthetic sensations, out of which we constitute a sense of space. On this view, our body thus already displays ‘originary intentionality’ in how it relates to the world.

The Political: NRx, Neoreactionism Archived.

This one is eclectic and for the record.


The techno-commercialists appear to have largely arrived at neoreaction via right-wing libertarianism. They are defiant free marketeers, sharing with other ultra-capitalists such as Randian Objectivists a preoccupation with “efficiency,” a blind trust in the power of the free market, private property, globalism and the onward march of technology. However, they are also believers in the ideal of small states, free movement and absolute or feudal monarchies with no form of democracy. The idea of “exit,” predominantly a techno-commercialist viewpoint but found among other neoreactionaries too, essentially comes down to the idea that people should be able to freely exit their native country if they are unsatisfied with its governance: essentially an application of market economics and consumer action to statehood. Indeed, countries are often described in corporate terms, with the King being the CEO and the aristocracy the shareholders.

The “theonomists” place more emphasis on the religious dimension of neoreaction. They emphasise tradition, divine law, religion (rather than race) as the defining characteristic of “tribes” of peoples, and traditional, patriarchal families. They are the closest group in terms of ideology to “classical” or, if you will, “palaeo-reactionaries” such as the High Tories, the Carlists and the French Ultra-royalists. Often Catholic and often ultramontanist.

Finally, there’s the “ethnicist” lot, who believe in racial segregation and have developed a new form of racial ideology called “Human Biodiversity” (HBD), which holds that people of African heritage are naturally less intelligent than people of Caucasian and East Asian heritage. Of course, the scientific community considers the idea that there are any genetic differences between human races beyond melanin levels in the skin and other cosmetic factors to be utterly false, but presumably this is because it is controlled by “The Cathedral.” They like “tribal solidarity,” tribes being defined by shared ethnicity, and distrust outsiders.


Overlap between these groups is considerable, but there are also vast differences not just between them but within them. What binds them together is common opposition to “The Cathedral” and to “progressive” ideology. Some of their criticisms of democracy and modern society are well-founded, and some of them make good points in defence of the monarchical system. However, I don’t much like them, and I doubt they’d much like me.

Whereas neoreactionaries are keen on the free market and praise capitalism, unregulated capitalism is something I am wary of. Capitalism saw the collapse of traditional monarchies in Europe in the 19th century, and the first revolutions were by capitalists seeking to establish democratic, capitalist republics where the bourgeoisie replaced the aristocratic elite as the ruling class, setting an example revolutionary socialists would later follow. Capitalism, when unregulated, leads to monopolies, exploitation of the working class, unsustainable practices in pursuit of increased short-term profits, globalisation and materialism.

Personally, I prefer distributist economics, which embrace private property rights but emphasise widespread ownership of wealth, with small partnerships and cooperatives replacing private corporations as the basic units of the nation’s economy. And although I am critical of democracy, the idea that any form of elected representation for the lower classes is anathema is not consistent with my viewpoint; my ideal government would not be an absolute or feudal monarchy, but an executive constitutional monarchy with a strong monarch exercising executive powers and the legislative role at least partially controlled by an elected parliament, more like the Bourbon Restoration than the Ancien Régime, though I occasionally say “Vive l’Ancien Régime!” on forums or in comments to annoy progressive types.

Finally, I don’t believe in racialism in any form. I tend to attribute preoccupations with racial superiority to a deep insecurity which people find the need to suppress by convincing themselves that they are “racially superior” to others, in the absence of any actual talent or especial ability to take pride in. The 20th century has shown us where dividing people up based on their genetics leads, and it is not somewhere I care to return to.

I do think it is important to go into why Reactionaries think Cthulhu always swims left, because without that they’re vulnerable to the charge that they have no a priori reason to expect our society to have the biases it does, and then the whole meta-suspicion of the modern Inquisition doesn’t work or at least doesn’t work in that particular direction. Unfortunately (for this theory) I don’t think their explanation is all that great (though this deserves substantive treatment) and we should revert to a strong materialist prior, but of course I would say that, wouldn’t I.

And of course you could get locked up for wanting fifty Stalins! Just try saying how great Enver Hoxha was at certain places and times. Of course saying you want fifty Stalins is not actually advocating that Stalinism become more like itself – as Leibniz pointed out, a neat way of telling whether something is something is checking whether it is exactly like that thing, and nothing could possibly be more like Stalinism than Stalinism. Of course fifty Stalins is further in the direction that one Stalin is from our implied default of zero Stalins. But then from an implied default of 1.3 kSt it’s a plea for moderation among hypostalinist extremists. As Mayberry Mobmuck himself says, “sovereign is he who determines the null hypothesis.”

Speaking of Stalinism, I think it does provide plenty of evidence that policy can do wonderful things for life expectancy and so on, and I mean that in a totally unironic “hail glorious comrade Stalin!” way, not in a “ha ha Stalin sure did kill a lot of people” way. But this is a super-unintuitive claim to most people today, so I’ll try to get around to summarizing the evidence at some point.

‘Neath an eyeless sky, the inkblack sea
Moves softly, utters not save a quiet sound
A lapping-sound, not saying what may be
The reach of its voice a furthest bound;
And beyond it, nothing, nothing known
Though the wind the boat has gently blown
Unsteady on shifting and traceless ground
And quickly away from it has flown.

Allow us a map, and a lamp electric
That by instrument we may probe the dark
Unheard sounds and an unseen metric
Keep alive in us that unknown spark
To burn bright and not consume or mar
Has the unbounded one come yet so far
For night over night the days to mark
His journey — adrift, without a star?

Chaos is the substrate, and the unseen action (or non-action) against disorder, the interloper. Disorder is a mere ‘messing up of order’. Chaos is substantial where disorder is insubstantial. Chaos is the ‘quintessence’ of things, chaotic itself and yet always-begetting order; it breaks down disorder, since disorder is maladaptive. Exit is a way to induce bifurcation, to quickly reduce entropy through separation from the highly entropic system. If no immediate exit is available, Chaos will create one.

Viral Load


Even if viruses have been quarantined on a user’s system, the user is often not allowed to access the quarantined files. The ostensible reason for this high level of secrecy is the claim that open access to computer virus code would result in people writing more computer viruses – a difficult claim for an antivirus company to make given that once they themselves have a copy of a virus then machines running their antivirus software should already be protected from that virus. A more believable explanation for antivirus companies’ unwillingness to release past virus programs is that a large part of their business model is predicated upon their ability to exclusively control stockpiles of past computer virus specimens as closely guarded intellectual property.

None of this absence of archival material is helped by the fact that the concept of a computer virus is itself an ontologically ambiguous category. The majority of so-called malicious software entities that have plagued Internet users in the past few years have technically not been viruses but worms. Additionally, despite attempts to define clear nosological and epidemiological categories for computer viruses and worms, there is still no consistent system for stabilizing the terms themselves, let alone assessing their relative populations. Elizabeth Grosz commented during an interview with the editors of Found Object journal that part of the reason for the ontological ambiguity of computer viruses is that they are an application of a biological metaphor that is largely indeterminate itself. According to Grosz, we are as mystified, if not more so, by biological viruses as we are by computer viruses. Perhaps we know even more about computer viruses than we do about biological viruses! The same obscurities are there at the biological level that exists at the computer level (…)

As Grosz suggests, it is no wonder that computer viruses are so ontologically uncertain, given that their biological namesakes threaten to undermine many of the binarisms that anchor modern Western technoscience, such as distinctions between organic and inorganic, dead and living, matter and form, and sexual and asexual reproduction.
