Triadomania. Thought of the Day 117.0

Peirce’s famous ‘triadomania’ lets most of his decisive distinctions appear in threes, following the tripartition of his list of categories, the famous triad of First, Second, and Third, or Quality, Reaction, Representation, or Possibility, Actuality, Reality.

Firstness is the mode of being of that which is such as it is, positively and without reference to anything else.

Secondness is the mode of being of that which is such as it is, with respect to a second but regardless of any third.

Thirdness is the mode of being of that which is such as it is, in bringing a second and third into relation to each other.

Firstness constitutes the quality of experience: in order for something to appear at all, it must do so due to a certain constellation of qualitative properties. Peirce often uses sensory qualities as examples, but it is important for the understanding of his thought that the examples may refer to phenomena very far from our standard conception of ‘sensory data’, e.g. forms or the ‘feeling’ of a whole melody or of a whole mathematical proof, not to be taken in a subjective sense but as a concept for the continuity of melody or proof as a whole, apart from the analytical steps and sequences into which it may subsequently be subdivided. In short, all sorts of simple and complex Gestalt qualities also qualify as Firstnesses. Firstnesses tend to form continua of possibilities, such as the continua of shape, color, tone, etc. These qualities, however, are, taken in themselves, pure possibilities and must necessarily be incarnated in phenomena in order to appear. Secondness is the phenomenological category of ‘incarnation’ which makes this possible: it is, then, the insistency with which the individuated, actualized, existent phenomenon appears. Thus, Secondness necessarily forms discontinuous breaks in Firstness, allowing particular qualities to enter into existence. The mind may imagine anything whatever in all sorts of quality combinations, but something appears with an irrefutable insisting power, reacting, actively, yielding resistance. Peirce’s favorite example is the resistance of the closed door – which might be imagined reduced to the mere quality of a feeling of resistance, and thus degenerate into pure Firstness, so that his theory would implode into a Hume-like solipsism – but to Peirce this resistance, surprise, event, this thisness, ‘haecceity’ as he calls it with a Scotist term, remains irreducible in the description of the phenomenon (a Kantian idea, at bottom: existence is no predicate).

About Thirdness, Peirce may directly state that continuity represents it perfectly: ‘continuity and generality are two names of the same absence of distinction of individuals’. As against Secondness, Thirdness is general; it mediates between First and Second. The events of Secondness are never completely unique – such an event would be inexperienceable – but relate (3) to other events (2) due to certain features (1) in them; Thirdness is thus what facilitates understanding as well as pragmatic action, due to its continuous generality. With a famous example: if you dream about an apple pie, then the very qualities of that dream (taste, smell, warmth, crustiness, etc.) are pure Firstnesses, while the act of baking is composed of a series of actual Secondnesses. But their coordination is governed by a Thirdness: the recipe. Being general, the recipe can never specify all properties in the individual apple pie; it has a schematic frame-character and subsumes an indefinite series – a whole continuum – of possible apple pies. Thirdness is thus necessarily general and vague. Of course, the recipe may be more or less precise, but no recipe exists which is able to determine each and every property in the cake, including date, hour, place, which tree the apples stem from, etc. – any recipe is necessarily general. In this case, the recipe (3) mediates between dream (1) and fulfilment (2) – its generality, symbolicity, relationality and future orientation are all characteristic of Thirdness.
An important aspect of Peirce’s realism is that continuous generality may be experienced directly in perceptual judgments: ‘Generality, Thirdness, pours in upon us in our very perceptual judgments’.

All these determinations remain purely phenomenological, even if the later semiotic and metaphysical interpretations clearly shine through. In a more general, non-Peircean terminology, his phenomenology can be seen as the description of minimum aspects inherent in any imaginable possible world – for this reason it is imaginability which is the main argument, and this might point in the direction that Peirce could be open to the critique of subjectivism so often aimed at Husserl’s project, which is in some respects analogous. The concept of consciousness is invoked as the basis of imaginability: phenomenology is the study of invariant properties in any phenomenon appearing for a mind. Peirce’s answer here would be, on the one hand, the research community which according to him defines reality – an argument which structurally corresponds to Husserl’s reference to intersubjectivity as a necessary ingredient in objectivity (an object is a phenomenon which is intersubjectively accessible). Peirce, however, has a further argument here, namely his consistent refusal to delimit his concept of mind exclusively to human subjects (a category the use of which he obviously tries to minimize); mind-like processes may take place in nature without any subject being responsible. Peirce will, for continuity reasons, never accept any hard distinction between subject and object and remains extremely parsimonious in the employment of such terms.

From Peirce’s The New Elements of Mathematics (Vol. 4):

But just as the qualities, which as they are for themselves, are equally unrelated to one other, each being mere nothing for any other, yet form a continuum in which and because of their situation in which they acquire more or less resemblance and contrast with one another; and then this continuum is amplified in the continuum of possible feelings of quality, so the accidents of reaction, which are waking consciousnesses of pairs of qualities, may be expected to join themselves into a continuum. 

Since, then an accidental reaction is a combination or bringing into special connection of two qualities, and since further it is accidental and antigeneral or discontinuous, such an accidental reaction ought to be regarded as an adventitious singularity of the continuum of possible quality, just as two points of a sheet of paper might come into contact.

But although singularities are discontinuous, they may be continuous to a certain extent. Thus the sheet instead of touching itself in the union of two points may cut itself all along a line. Here there is a continuous line of singularity. In like manner, accidental reactions though they are breaches of generality may come to be generalized to a certain extent.

Secondness is now taken to actualize these quality possibilities based on an idea that any actual event involves a clash of qualities – in the ensuing argumentation Peirce underlines that the qualities involved in actualization need not be restrained to two but may be many, if they may only be ‘dissolved’ into pairs and hence do not break into the domain of Thirdness. This appearance of actuality, hence, has the property of singularities, spontaneously popping up in the space of possibilities and actualizing pairs of points in it. This transition from First to Second is conceived of along Aristotelian lines: as an actualization of a possibility – and this is expressed in the picture of a discontinuous singularity in the quality continuum. The topological fact that singularities must in general be defined with respect to the neighborhood of the manifold in which they appear, now becomes the argument for the fact that Secondness can never be completely discontinuous but still ‘inherits’ a certain small measure of continuity from the continuum of Firstness. Singularities, being discontinuous along certain dimensions, may be continuous in others, which provides the condition of possibility for Thirdness to exist as a tendency for Secondness to conform to a general law or regularity. As is evident, a completely pure Secondness is impossible in this continuous metaphysics – it remains a conceivable but unrealizable limit case, because a completely discontinuous event would amount to nothing. Thirdness already lies as a germ in the non-discontinuous aspects of the singularity. The occurrences of Secondness seem to be infinitesimal, then, rather than completely extensionless points.

Fundamental Theorem of Asset Pricing: Tautological Meeting of Mathematical Martingale and Financial Arbitrage by the Measure of Probability.

The Fundamental Theorem of Asset Pricing (FTAP hereafter) has two broad tenets, viz.

1. A market admits no arbitrage if and only if the market has a martingale measure.

2. Every contingent claim can be hedged if and only if the martingale measure is unique.

The FTAP is a theorem of mathematics, and the use of the term ‘measure’ in its statement places the FTAP within the theory of probability formulated by Andrei Kolmogorov (Foundations of the Theory of Probability) in 1933. Kolmogorov’s work took place in a context captured by Bertrand Russell, who observed that

It is important to realise the fundamental position of probability in science. . . . As to what is meant by probability, opinions differ.

In the 1920s the idea of randomness, as distinct from a lack of information, was becoming substantive in the physical sciences because of the emergence of the Copenhagen Interpretation of quantum mechanics. In the social sciences, Frank Knight argued that uncertainty was the only source of profit, and the concept was pervading John Maynard Keynes’ economics (Robert Skidelsky, Keynes: The Return of the Master).

Two mathematical theories of probability had become ascendant by the late 1920s. Richard von Mises (brother of the Austrian economist Ludwig) attempted to lay down the axioms of classical probability within a framework of Empiricism, the ‘frequentist’ or ‘objective’ approach. To counterbalance von Mises, the Italian actuary Bruno de Finetti presented a more Pragmatic approach, characterised by his claim that “Probability does not exist” because it was only an expression of the observer’s view of the world. This ‘subjectivist’ approach was closely related to the less well-known position taken by the Pragmatist Frank Ramsey, who developed an argument against Keynes’ Realist interpretation of probability presented in the Treatise on Probability.

Kolmogorov addressed the trichotomy of mathematical probability by generalising so that Realist, Empiricist and Pragmatist probabilities were all examples of ‘measures’ satisfying certain axioms. In doing this, a random variable became a function while an expectation was an integral: probability became a branch of Analysis, not Statistics. Von Mises criticised Kolmogorov’s generalised framework as unnecessarily complex. About a decade and a half ago, the physicist Edwin Jaynes (Probability Theory: The Logic of Science) championed Leonard Savage’s subjectivist Bayesianism as having a “deeper conceptual foundation which allows it to be extended to a wider class of applications, required by current problems of science”.
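To make the shift concrete: in Kolmogorov’s framework a probability is a measure on a sample space, a random variable is a function on that space, and an expectation is an integral against the measure. A minimal sketch, with a toy two-outcome sample space invented purely for illustration:

```python
# In Kolmogorov's formulation a random variable is just a function on a sample space,
# and an expectation is an integral (here a finite sum) against a probability measure.
# The sample space, measure and payoffs below are invented for illustration.

omega = ["up", "down"]                 # sample space
P = {"up": 0.5, "down": 0.5}           # a probability measure: non-negative, sums to 1

def X(outcome):                        # a random variable: a function omega -> R
    return 120.0 if outcome == "up" else 90.0

expectation = sum(X(w) * P[w] for w in omega)   # E[X] as an integral with respect to P
print(expectation)                              # 105.0
```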

The objections of empirical scientists to measure-theoretic probability can be accounted for as a lack of physicality. Frequentist probability is based on the act of counting; subjectivist probability is based on a flow of information, which, following Claude Shannon, is now an observable entity in empirical science. Measure-theoretic probability is based on abstract mathematical objects unrelated to sensible phenomena. However, the generality of Kolmogorov’s approach made it flexible enough to handle problems that emerged in physics and engineering during the Second World War, and his approach became widely accepted after 1950 because it was practically more useful.

In the context of the first statement of the FTAP, a ‘martingale measure’ is a probability measure, usually labelled $Q$, such that the (real, rather than nominal) price of an asset today, $X_0$, is the expectation, using the martingale measure, of its (real) price in the future, $X_T$. Formally,

$$X_0 = \mathbb{E}^{Q}[X_T]$$

The abstract probability distribution $Q$ is defined so that this equality holds, not on the basis of any empirical information about historical prices or subjective judgement of future prices. The only condition placed on the relationship that the martingale measure has with the ‘natural’, or ‘physical’, probability measure, usually assigned the label $P$, is that they agree on what is possible.

The term ‘martingale’ in this context derives from doubling strategies in gambling, and it was introduced into mathematics by Jean Ville in a development of von Mises’ work. The idea that asset prices have the martingale property was first proposed by Benoit Mandelbrot in response to an early formulation of Eugene Fama’s Efficient Market Hypothesis (EMH), the two concepts being combined by Fama. For Mandelbrot and Fama the key consequence of prices being martingales was that the current price was independent of the future price, and technical analysis would not prove profitable in the long run. In developing the EMH there was no discussion of the nature of the probability under which assets are martingales, and it is often assumed that the expectation is calculated under the natural measure. While the FTAP employs modern terminology in the context of value-neutrality, the idea of equating a current price with a future, uncertain, price has ethical ramifications.

The other technical term in the first statement of the FTAP, arbitrage, has long been used in financial mathematics. Fibonacci’s Liber Abaci (Laurence Sigler, Fibonacci’s Liber Abaci) discusses ‘Barter of Merchandise and Similar Things’: 20 arms of cloth are worth 3 Pisan pounds and 42 rolls of cotton are similarly worth 5 Pisan pounds; it is sought how many rolls of cotton will be had for 50 arms of cloth. In this case there are three commodities – arms of cloth, rolls of cotton and Pisan pounds – and Fibonacci solves the problem by having Pisan pounds ‘arbitrate’, or ‘mediate’ as Aristotle might say, between the other two commodities.
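Fibonacci’s problem reduces to simple proportion, with the Pisan pound acting as the mediating measure of value. A minimal sketch of the arithmetic (variable names are illustrative only):

```python
# Fibonacci's barter problem: Pisan pounds 'arbitrate' between cloth and cotton.
# The figures are those quoted in the passage above.

cloth_arms, cloth_price = 20, 3       # 20 arms of cloth are worth 3 Pisan pounds
cotton_rolls, cotton_price = 42, 5    # 42 rolls of cotton are worth 5 Pisan pounds

pounds_per_arm = cloth_price / cloth_arms       # value of one arm of cloth in pounds
rolls_per_pound = cotton_rolls / cotton_price   # rolls of cotton obtained for one pound

arms_to_trade = 50
value_in_pounds = arms_to_trade * pounds_per_arm     # 7.5 Pisan pounds
rolls_received = value_in_pounds * rolls_per_pound   # 63 rolls of cotton

print(f"{arms_to_trade} arms of cloth -> {value_in_pounds} pounds -> {rolls_received} rolls")
```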

Within neo-classical economics, the Law of One Price was developed in a series of papers between 1954 and 1964 by Kenneth Arrow, Gérard Debreu and Lionel McKenzie in the context of general equilibrium, in particular through the introduction of the Arrow Security, which, employing the Law of One Price, could be used to price any asset. It was on this principle that Black and Scholes believed the value of the warrants could be deduced by employing a hedging portfolio; in introducing their work with the statement that “it should not be possible to make sure profits”, they were invoking the arbitrage argument, which had an eight-hundred-year history. In the context of the FTAP, ‘an arbitrage’ has developed into the ability to formulate a trading strategy such that the probability, under a natural or martingale measure, of a loss is zero, but the probability of a positive profit is not.

To understand the connection between the financial concept of arbitrage and the mathematical idea of a martingale measure, consider the most basic case of a single asset whose current price, $X_0$, can take on one of two (present) values, $X_T^D < X_T^U$, at time $T > 0$, in the future. In this case an arbitrage would exist if $X_0 \leq X_T^D < X_T^U$: buying the asset now, at a price that is less than or equal to the future pay-offs, would lead to a possible profit at the end of the period, with the guarantee of no loss. Similarly, if $X_T^D < X_T^U \leq X_0$, short selling the asset now, and buying it back, would also lead to an arbitrage. So, for there to be no arbitrage opportunities we require that

$$X_T^D < X_0 < X_T^U$$

This implies that there is a number, $0 < q < 1$, such that

$$X_0 = X_T^D + q(X_T^U - X_T^D) = qX_T^U + (1-q)X_T^D$$

The price now, $X_0$, lies between the future prices, $X_T^U$ and $X_T^D$, in the ratio $q : (1-q)$ and represents some sort of ‘average’. The first statement of the FTAP can be interpreted simply as “the price of an asset must lie between its maximum and minimum possible (real) future price”.

If $X_0 < X_T^D \leq X_T^U$ we have that $q < 0$, whereas if $X_T^D \leq X_T^U < X_0$ then $q > 1$, and in both cases $q$ does not represent a probability measure, which, by Kolmogorov’s axioms, must lie between 0 and 1. In either of these cases an arbitrage exists and a trader can make a riskless profit: the market involves ‘turpe lucrum’. This account gives an insight as to why James Bernoulli, in his moral approach to probability, considered situations where probabilities did not sum to 1: he was considering problems that were pathological not because they failed the rules of arithmetic but because they were unfair. It follows that if there are no arbitrage opportunities then the quantity $q$ can be seen as representing the ‘probability’ that the $X_T^U$ price will materialise in the future. Formally,

$$X_0 = qX_T^U + (1-q)X_T^D \equiv \mathbb{E}^{Q}[X_T]$$
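The algebra above is easy to check numerically. A minimal sketch, using invented prices (the function name and the numbers are illustrative assumptions, not taken from any market):

```python
# Two-state, one-period example from the text: recover the martingale 'probability' q
# and check the no-arbitrage condition X_T^D < X_0 < X_T^U.

def martingale_probability(x0, x_down, x_up):
    """Return q such that x0 = q*x_up + (1-q)*x_down."""
    return (x0 - x_down) / (x_up - x_down)

x0, x_down, x_up = 100.0, 90.0, 120.0      # hypothetical present and future (real) prices

q = martingale_probability(x0, x_down, x_up)
no_arbitrage = 0 < q < 1                   # equivalent to x_down < x0 < x_up

expectation_under_q = q * x_up + (1 - q) * x_down
print(f"q = {q:.4f}, no arbitrage: {no_arbitrage}")
print(f"E_Q[X_T] = {expectation_under_q:.2f} (equals X_0 = {x0})")
```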

The connection between the financial concept of arbitrage and the mathematical object of a martingale is essentially a tautology: both statements mean that the price today of an asset must lie between its future minimum and maximum possible value. This first statement of the FTAP was anticipated by Frank Ramsey when he defined ‘probability’ in the Pragmatic sense of ‘a degree of belief’ and argued that the way to measure ‘degrees of belief’ is through betting odds. On this basis he formulated some axioms of probability, including that a probability must lie between 0 and 1. He then goes on to say that

These are the laws of probability … If anyone’s mental condition violated these laws, his choice would depend on the precise form in which the options were offered him, which would be absurd. He could have a book made against him by a cunning better and would then stand to lose in any event.

This is a Pragmatic argument that identifies the absence of the martingale measure with the existence of arbitrage, and today this forms the basis of the standard argument as to why arbitrages do not exist: if they did, other market participants would bankrupt the agent who was mis-pricing the asset. This has become known in philosophy as the ‘Dutch Book’ argument and, as a consequence of the fact/value dichotomy, it is often presented as a ‘matter of fact’. However, setting the fact/value dichotomy aside, the Dutch Book argument is a version of the ‘Golden Rule’ – “Do to others as you would have them do to you.” – it is infused with the moral concepts of fairness and reciprocity (Jeffrey Wattles, The Golden Rule).

The FTAP thus embodies the ethical concept of Justice, capturing the social norms of reciprocity and fairness. This is significant in the context of Granovetter’s discussion of embeddedness in economics. It is conventional to assume that mainstream economic theory is ‘undersocialised’: agents are rational calculators seeking to maximise an objective function. The argument presented here is that a central theorem in contemporary economics, the FTAP, is deeply embedded in social norms, despite being presented as an undersocialised mathematical object. This embeddedness is a consequence of the origins of mathematical probability lying in the ethical analysis of commercial contracts: the feudal shackles are still binding this most modern of economic theories.

Ramsey goes on to make an important point:

Having any definite degree of belief implies a certain measure of consistency, namely willingness to bet on a given proposition at the same odds for any stake, the stakes being measured in terms of ultimate values. Having degrees of belief obeying the laws of probability implies a further measure of consistency, namely such a consistency between the odds acceptable on different propositions as shall prevent a book being made against you.

Ramsey is arguing that an agent needs to employ the same measure in pricing all assets in a market, and this is the key result in contemporary derivative pricing. Having identified the martingale measure on the basis of a ‘primal’ asset, it is then applied across the market, in particular to derivatives on the primal asset, with the well-known result that if two assets offer different ‘market prices of risk’, an arbitrage exists. This explains why the market price of risk appears in the Radon-Nikodym derivative and the Capital Market Line: it enforces Ramsey’s consistency in pricing.

The second statement of the FTAP is concerned with incomplete markets, which appear in relation to Arrow-Debreu prices. In mathematics, in the special case that there are as many, or more, assets in a market as there are possible future, uncertain, states, a unique pricing vector can be deduced for the market because of Cramer’s Rule. If the elements of the pricing vector satisfy the axioms of probability, specifically that each element is positive and they all sum to one, then the market precludes arbitrage opportunities. This is the case covered by the first statement of the FTAP. In the more realistic situation that there are more possible future states than assets, the market can still be arbitrage-free, but the pricing vector, the martingale measure, might not be unique. The agent can still be consistent in selecting which particular martingale measure they choose to use, but another agent might choose a different measure, such that the two do not agree on a price. In the context of the Law of One Price, this means that we cannot hedge, replicate or cover a position in the market such that the portfolio is riskless. The significance of the second statement of the FTAP is that it tells us that in the sensible world of imperfect knowledge and transaction costs, a model within the framework of the FTAP cannot give a precise price. When faced with incompleteness in markets, agents need alternative ways to price assets, and behavioural techniques have come to dominate financial theory. This feature was realised in The Port Royal Logic when it recognised the role of transaction costs in lotteries.
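The distinction between the two statements of the FTAP can be made concrete with a small numerical sketch: with as many (independent) assets as future states the pricing equations pin down a unique state-price vector, and hence a unique martingale measure; with more states than assets the same equations are underdetermined and many measures are admissible. The payoffs and prices below are invented for illustration:

```python
import numpy as np

# State-price sketch for the two statements of the FTAP.
# Rows of the payoff matrix are assets, columns are future states.

prices = np.array([1.0, 100.0])          # current prices: riskless bond, risky asset

# Complete market: two assets, two states -> a unique state-price vector.
payoffs_2 = np.array([[1.0, 1.0],        # bond pays 1 in both states
                      [120.0, 90.0]])    # risky asset pays 120 (up) or 90 (down)
psi = np.linalg.solve(payoffs_2, prices) # unique solution of payoffs_2 @ psi = prices
q = psi / psi.sum()                      # normalise to a probability (martingale) measure
print("unique martingale measure:", q)   # [1/3, 2/3] with these numbers

# Incomplete market: two assets, three states -> the same equations are underdetermined,
# so infinitely many state-price vectors (and hence martingale measures) are consistent
# with the observed prices; different agents may legitimately settle on different ones.
payoffs_3 = np.array([[1.0, 1.0, 1.0],
                      [120.0, 100.0, 90.0]])
psi_min_norm, *_ = np.linalg.lstsq(payoffs_3, prices, rcond=None)  # one of many solutions
psi_alternative = np.array([0.2, 0.4, 0.4])                        # another valid solution
assert np.allclose(payoffs_3 @ psi_alternative, prices)
print("two admissible state-price vectors:", psi_min_norm, psi_alternative)
```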

Fascism. Drunken Risibility

You must create your life, as you’d create a work of art. It’s necessary that the life of an intellectual be artwork with him as the subject. True superiority is all here. At all costs, you must preserve liberty, to the point of intoxication. — Gabriele d’Annunzio

The complex relationship between fascism and modernity cannot be resolved all at once, and with a simple yes or no. It has to be developed in the unfolding story of fascism’s acquisition and exercise of power. The most satisfactory work on this matter shows how antimodernizing resentments were channeled and neutralized, step by step, in specific legislation, by more powerful pragmatic and intellectual forces working in the service of an alternate modernity.

The word fascism has its root in the Italian fascio, literally a bundle or sheaf. More remotely, the word recalled the Latin fasces, an axe encased in a bundle of rods that was carried before the magistrates in Roman public processions to signify the authority and unity of the state. Before 1914, the symbolism of the Roman fasces was usually appropriated by the Left. Marianne, symbol of the French Republic, was often portrayed in the nineteenth century carrying the fasces to represent the force of Republican solidarity against her aristocratic and clerical enemies. Italian revolutionaries used the term fascio in the late nineteenth century to evoke the solidarity of committed militants. The peasants who rose against their landlords in Sicily in 1893–94 called themselves the Fasci Siciliani. When in late 1914 a group of left-wing nationalists, soon joined by the socialist outcast Benito Mussolini, sought to bring Italy into World War I on the Allied side, they chose a name designed to communicate both the fervor and the solidarity of their campaign: the Fascio Rivoluzionario d’Azione Interventista (Revolutionary League for Interventionist Action). At the end of World War I, Mussolini coined the term fascismo to describe the mood of the little band of nationalist ex-soldiers and pro-war syndicalist revolutionaries that he was gathering around himself. Even then, he had no monopoly on the word fascio, which remained in general use for activist groups of various political hues. Officially, Fascism was born in Milan on Sunday, March 23, 1919. That morning, somewhat more than a hundred persons, including war veterans, syndicalists who had supported the war, and Futurist intellectuals, plus some reporters and the merely curious, gathered in the meeting room of the Milan Industrial and Commercial Alliance, overlooking the Piazza San Sepolcro, to “declare war against socialism . . . because it has opposed nationalism.” Now Mussolini called his movement the Fasci di Combattimento, which means, very approximately, “fraternities of combat.”

Definitions are inherently limiting. They frame a static picture of something that is better perceived in movement, and they portray as “frozen ‘statuary’” something that is better understood as a process. They succumb all too often to the intellectual’s temptation to take programmatic statements as constitutive, and to identify fascism more with what it said than with what it did. The quest for the perfect definition, by reducing fascism to one ever more finely honed phrase, seems to shut off questions about the origins and course of fascist development rather than open them up. Fascism, by contrast, was a new invention created afresh for the era of mass politics. It sought to appeal mainly to the emotions by the use of ritual, carefully stage-managed ceremonies, and intensely charged rhetoric. The role programs and doctrine play in it is, on closer inspection, fundamentally unlike the role they play in conservatism, liberalism, and socialism. Fascism does not rest explicitly upon an elaborated philosophical system, but rather upon popular feelings about master races, their unjust lot, and their rightful predominance over inferior peoples. It has not been given intellectual underpinnings by any system builder, like Marx, or by any major critical intelligence, like Mill, Burke, or Tocqueville. In a way utterly unlike the classical “isms,” the rightness of fascism does not depend on the truth of any of the propositions advanced in its name. Fascism is “true” insofar as it helps fulfill the destiny of a chosen race or people or blood, locked with other peoples in a Darwinian struggle, and not in the light of some abstract and universal reason.