Complicated Singularities – Why Should the Discriminant Locus Change Under Dualizing?

Consider the surface S ⊆ (C*)2 defined by the equation z1 + z2 + 1 = 0. Define the map log : (C*)2 → R2 by log(z1, z2) = (log|z1|, log|z2|). Then log(S) can be seen as follows. Consider first the image |S| of S under the absolute value map (z1, z2) ↦ (|z1|, |z2|).

[Figure: the image |S| of S under the absolute value map]

The line segment r1 + r2 = 1 with r1, r2 ≥ 0 is the image of {(−a, a − 1) | 0 < a < 1} ⊆ S; the ray r2 = r1 + 1 with r1 ≥ 0 is the image of {(−a, a − 1) | a < 0} ⊆ S; and the ray r1 = r2 + 1 with r2 ≥ 0 is the image of {(−a, a − 1) | a > 1} ⊆ S. The map S → |S| is one-to-one over the boundary of |S| and two-to-one over the interior, with (z1, z2) and (z̄1, z̄2) mapping to the same point of |S|. Taking the logarithm of this picture, we obtain the amoeba log(S) of S, as depicted below.

[Figure: the amoeba log(S) of S]
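The amoeba is easy to reproduce numerically. Here is a minimal sketch (Python, assuming numpy and matplotlib are available; the sampling ranges are arbitrary choices, not from the text), sampling z1 and solving the defining equation for z2:

```python
# Minimal sketch: sample the surface S = {z1 + z2 + 1 = 0} in (C*)^2
# and plot its amoeba, i.e. its image under (log|z1|, log|z2|).
import numpy as np
import matplotlib.pyplot as plt

# Sample z1 = r * exp(i*theta). Since (z1, z2) and (conj(z1), conj(z2))
# map to the same point, angles in [0, pi] suffice.
r = np.exp(np.linspace(-4.0, 4.0, 500))        # log-spaced radii
theta = np.linspace(0.0, np.pi, 500)
R, T = np.meshgrid(r, theta)
z1 = R * np.exp(1j * T)
z2 = -1.0 - z1                                  # defining equation of S

x = np.log(np.abs(z1)).ravel()
y = np.log(np.abs(z2)).ravel()

plt.scatter(x, y, s=0.1)
plt.xlabel("log|z1|")
plt.ylabel("log|z2|")
plt.title("Amoeba of z1 + z2 + 1 = 0")
plt.show()
```

The three tentacles of the resulting plot correspond to the segment and two rays of |S| described above, stretched out to infinity by the logarithm.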

Now identify S with S × {0} ⊆ Y = (C*)2 × R = T2 × R3. We can then obtain a six-dimensional space X with a map π : X → Y which is an S1-bundle over Y\S degenerating over S, so that π−1(S) → S is a homeomorphism. Composing π with the map (log, id) : (C*)2 × R → R3 = B gives a T3-fibration f : X → B. Clearly the discriminant locus of f is log(S) × {0}. If b lies in the interior of log(S) × {0}, then f−1(b) is obtained topologically from T3 = T2 × S1 by contracting two circles {p1} × S1 and {p2} × S1 to points. These are the familiar conical singularities seen in the special Lagrangian situation.

If b ∈ ∂(log(S) × {0}), then f−1(b) has a slightly more complicated singularity, but only one singular point. Let us examine how the “generic” singular fiber fits in here. For b in the interior of log(S) × {0}, the discriminant locus locally splits B into two regions, and these two regions represent two different possible smoothings of f−1(b).

Assume now that f : X → B is a special Lagrangian fibration with this topology, whose discriminant locus ∆ is an amoeba. Let b ∈ Int(∆), and set M = f−1(b). Set M° = M\{x1, x2}, where x1, x2 are the two conical singularities of M. Suppose that the tangent cones C1 and C2 to M at these two points are both cones of the form M0. Then the links Σ1 and Σ2 of these cones are T2’s, and one expects that topologically they can be described as follows. Note that M° ≅ (T2\{y1, y2}) × S1, where y1, y2 are two points of T2. We assume that the link Σi takes the form γi × S1, where γi is a simple loop around yi. If these assumptions hold, then to see how M can be smoothed, we consider the restriction map in cohomology

H1(M°, R) → H1(Σ1, R) ⊕ H1(Σ2, R)

The image of this map is two-dimensional. Indeed, if we write a basis ei1, ei2 of H1(Σi, R), where ei1 is Poincaré dual to [γi × pt] and ei2 is Poincaré dual to [pt × S1], it is not difficult to see that the image of the restriction map is spanned by (e11, e21) and (e12, −e22). Now this model of a topological fibration is not special Lagrangian, so in particular we do not know exactly how the tangent cones to M at x1 and x2 sit inside C3, and thus they cannot be compared directly with an asymptotically conical smoothing. So, to make a plausibility argument, choose new bases fi1, fi2 of H1(Σi, R) such that, if Mi(a,0,0), Mi(0,a,0) and Mi(0,0,a) denote the three possible smoothings of the tangent cone Ci at the singular point xi of M, then Y(Mi(a,0,0)) = πa fi1, Y(Mi(0,a,0)) = πa fi2, and Y(Mi(0,0,a)) = −πa(fi1 + fi2).

Suppose that in this new basis the image of the restriction map is spanned by the pairs (f11, rf22) and (rf12, f21) for some r > 0, r ≠ 1. Then there are two possible ways of smoothing M: either by gluing in M1(a,0,0) and M2(0,ra,0) at the singular points x1 and x2 respectively, or by gluing in M1(0,ra,0) and M2(a,0,0) at x1 and x2 respectively. These could correspond to deforming M to a fiber over a point on one side of the discriminant locus of f or the other. This at least gives a plausibility argument for the existence of a special Lagrangian fibration of the topological type given by f. To date, however, no such fibrations have been constructed.
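To make the gluing criterion explicit (this is a reconstruction from the formulas above, not an argument taken verbatim from any source): a choice of smoothings glued in at x1 and x2 is admissible only if the pair of their invariants lies in the image of the restriction map. In LaTeX notation:

```latex
% First smoothing: glue $M^1_{(a,0,0)}$ at $x_1$ and $M^2_{(0,ra,0)}$ at $x_2$:
\left( Y(M^1_{(a,0,0)}),\; Y(M^2_{(0,ra,0)}) \right)
  = \left( \pi a f_{11},\; \pi r a f_{22} \right)
  = \pi a \left( f_{11},\; r f_{22} \right),
% a multiple of the first spanning pair, hence in the image of
% $H^1(M^{\circ},\mathbf{R}) \to H^1(\Sigma_1,\mathbf{R}) \oplus H^1(\Sigma_2,\mathbf{R})$.
% Second smoothing: glue $M^1_{(0,ra,0)}$ at $x_1$ and $M^2_{(a,0,0)}$ at $x_2$:
\left( Y(M^1_{(0,ra,0)}),\; Y(M^2_{(a,0,0)}) \right)
  = \pi a \left( r f_{12},\; f_{21} \right),
% a multiple of the second spanning pair. For $r \neq 1$, no other
% combination of the three smoothings at $x_1$ and $x_2$ lands in the
% image, which is why exactly two smoothings appear, one for each side
% of the discriminant locus.
```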

Once one grants special Lagrangian fibrations with codimension one discriminant and singular fibers with cone-over-T2 singularities, one is forced to confront codimension one discriminant loci in special Lagrangian fibrations. This leads inevitably to the conclusion that a “strong form” of the Strominger-Yau-Zaslow conjecture cannot hold. In particular, one is forced to conclude that if f : X → B and f′ : X′ → B are dual special Lagrangian fibrations, then their discriminant loci cannot coincide. Thus one cannot hope for a fiberwise definition of the dualizing process, and the concept of dualizing fibrations needs to be refined. Let us see why the discriminant locus must change under dualizing. The key lies in the behaviour of the positive and negative vertices: in the positive case the critical locus of the local model of the fibration is a union of three holomorphic curves, while in the negative case the critical locus is a pair of pants. In a “generic” special Lagrangian fibration, we expect the critical locus to remain roughly the same, but its image in the base B will be fattened out. In the negative case this image will be an amoeba; in the positive case, since the critical locus locally consists of a union of three holomorphic curves, we expect the discriminant locus to be a union of three different amoebas. The figure below shows the new discriminant locus in these two cases.

[Figure: the fattened discriminant loci near a negative vertex (a single amoeba) and a positive vertex (a union of three amoebas)]

Now, under dualizing, positive and negative vertices are interchanged. Thus the discriminant locus must change. This is all quite speculative, of course, and underlying this is the assumption that the discriminant loci are just fattenings of the graphs. However, it is clear that a new notion of dualizing is necessary to cover this eventuality.


Cryptocurrency and Efficient Market Hypothesis. Drunken Risibility.

According to the traditional definition, a currency has three main properties: (i) it serves as a medium of exchange, (ii) it is used as a unit of account, and (iii) it serves as a store of value. Throughout economic history, monies were tied to political power. In the beginning, coins were minted in precious metals, so the value of a coin was intrinsically determined by the value of the metal itself. Later, money was printed on paper bank notes, but its value was linked to a quantity of gold guarded in the vault of a central bank. Nation states have used their political power to regulate the use of currencies and to impose one currency (usually the one issued by the nation state itself) as legal tender for obligations within their territory. In the twentieth century a major change took place: the abandonment of the gold standard. The detachment of currencies (especially the US dollar) from gold meant a recognition that the value of a currency (especially in a world of fractional banking) is related not to its content or representation in gold, but to a broader notion: confidence in the economy on which the currency is based. Today, the value of a currency reflects the best judgment about the monetary policy and the “health” of its economy.

In recent years, a new type of currency, a synthetic one, emerged. We call this new type “synthetic” because it is not the decision of a nation state, nor does it represent any underlying asset or tangible source of wealth. It appears as a new tradable asset resulting from a private agreement and facilitated by the anonymity of the internet. Among these synthetic currencies, Bitcoin (BTC) has emerged as the most important one, with a market capitalization a few hundred million short of $80 billion.


Bitcoin Price Chart from Bitstamp

There are other cryptocurrencies based on blockchain technology, such as Litecoin (LTC), Ethereum (ETH) and Ripple (XRP). The website https://coinmarketcap.com/currencies/ lists 641 such currencies. However, as we can observe in the figure below, Bitcoin represents 89% of the total market capitalization of all cryptocurrencies.


Cryptocurrencies. Share of market capitalization of each currency.

One open question today is whether Bitcoin is in fact, or may be considered, a currency. So far, Bitcoin does not fulfill the main properties of a standard currency. It is barely (though increasingly so!) accepted as a medium of exchange (e.g. to buy some products online); it is not used as a unit of account (there are no financial statements valued in Bitcoin); and, given the great swings in its price, one can hardly consider Bitcoin a suitable option for storing value. Given these characteristics, Bitcoin fits instead as an ideal asset for speculative purposes: there is no underlying asset to relate its value to, and there is an open platform on which to trade it round the clock.


Bitcoin returns, sampled every 5 hours.

Speculation has a long history and seems inherent to capitalism. One common feature of speculative assets throughout history has been the difficulty of valuation. Tulipmania, the South Sea bubble and many other episodes reflect, on one side, human greed and, on the other, the difficulty of assigning an objective value to an asset. All these speculative episodes were reflected in super-exponential growth of the price time series.

Cryptocurrencies can be seen as the libertarian response to the failure of central banks to manage financial crises, such as the one that occurred in 2008. Cryptocurrencies can also bypass national restrictions on international transfers, probably at a cheaper cost. Bitcoin was created by a person or group of persons under the pseudonym Satoshi Nakamoto. The discussion of Bitcoin has several perspectives.

The computer science perspective deals with the strengths and weaknesses of blockchain technology. In fact, according to R. Ali et al., the introduction of a “distributed ledger” is the key innovation. Traditional means of payment (e.g. a credit card) rely on a central clearing house that validates operations, acting as a “middleman” between buyer and seller. The payment validation system of Bitcoin, on the contrary, is decentralized. There is a growing army of miners who put their computing power at the disposal of the network, validating transactions by gathering them into blocks and adding these to the ledger, forming a ’block chain’. This work is remunerated by giving the miners Bitcoins, which (so far) makes validation costs cheaper than in a centralized system. Validation is carried out by solving a kind of algorithmic puzzle, which becomes harder with time, since the whole ledger must be validated; consequently it takes more time to solve. Contrary to traditional currencies, the total number of Bitcoins to be issued is fixed in advance at 21 million, and the issuance rate is designed to diminish over time. According to Laursen and Kyed, validating the public ledger was initially rewarded with 50 Bitcoins, but the protocol foresaw halving this quantity every four years. At the current pace, the maximum number of Bitcoins will be reached in 2140.

Given this decentralized character, Bitcoin transactions seem secure. All transactions are recorded on several computer servers around the world, so in order to commit fraud one would have to change and validate (simultaneously) several ledgers, which is practically impossible. Additionally, the ledgers are public, with the identities of the parties encrypted, making transactions “pseudonymous, not anonymous”. The legal perspective on Bitcoin is fuzzy: Bitcoin is not issued, nor endorsed, by a nation state; it is not an illegal substance; and, as such, its trading is largely unregulated.
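To make the issuance arithmetic (50 BTC halved every four years, capped at 21 million) concrete, here is a minimal sketch in plain Python. The figure of 210,000 blocks per halving era is a protocol constant not quoted in the text above, so treat it as an assumption.

```python
# Minimal sketch of Bitcoin's issuance schedule. Assumed protocol
# constants: initial reward of 50 BTC, halved every 210,000 blocks
# (roughly every four years at ~10 minutes per block).
BLOCKS_PER_HALVING = 210_000
INITIAL_REWARD = 50.0

def total_supply(halvings: int = 64) -> float:
    """Total BTC issued after the given number of halving eras."""
    total, reward = 0.0, INITIAL_REWARD
    for _ in range(halvings):
        total += BLOCKS_PER_HALVING * reward
        reward /= 2.0
    return total

# Geometric series: 210000 * 50 * (1 + 1/2 + 1/4 + ...) -> 21 million,
# matching the cap quoted above.
print(total_supply())  # just under 21,000,000
```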

In particular, the nonexistence of savings accounts in Bitcoin, and consequently the absence of a Bitcoin interest rate, precludes the idea of studying the price behavior in relation to cash flows generated by Bitcoin. As a consequence, the study of the underlying dynamics of the price signal finds its theoretical framework in the Efficient Market Hypothesis. The Efficient Market Hypothesis (EMH) is the cornerstone of financial economics. One of the seminal works on the stochastic dynamics of speculative prices is due to L. Bachelier, who in his doctoral thesis developed the first mathematical model of the behavior of stock prices. The systematic study of informational efficiency began in the 1960s, when financial economics was born as a new area within economics.

The classical definition, due to Eugene Fama (Foundations of Finance: Portfolio Decisions and Securities Prices, 1976), says that a market is informationally efficient if it “fully reflects all available information”. Therefore the key element in assessing efficiency is to determine the appropriate set of information that impels prices. Following Efficient Capital Markets, informational efficiency can be divided into three categories: (i) weak efficiency, if prices reflect the information contained in the past series of prices; (ii) semi-strong efficiency, if prices reflect all public information; and (iii) strong efficiency, if prices reflect all public and private information. As a corollary of the EMH, one cannot accept the presence of long memory in financial time series, since its existence would allow a riskless, profitable trading strategy; if markets are informationally efficient, arbitrage prevents the possibility of such strategies. If we consider the financial market as a dynamical structure, short-term memory can exist (to some extent) without contradicting the EMH; indeed, the presence of some mispriced assets is the necessary stimulus for individuals to trade and reach an (almost) arbitrage-free situation. However, the presence of long-range memory is at odds with the EMH, because it would allow stable trading rules to beat the market.

The presence of long-range dependence in financial time series generates a vivid debate. Whereas short-term memory can stimulate investors to exploit small extra returns, thereby making them disappear, long-range correlations pose a challenge to the established financial model. As recognized by Ciaian et al., the Bitcoin price is not driven by macro-financial indicators. Consequently, a detailed analysis of the underlying dynamics (the Hurst exponent) becomes important for understanding its emerging behavior. There are several methods (both parametric and non-parametric) for calculating the Hurst exponent, which thus becomes a mandatory part of any framework for tackling BTC trading.
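As an illustration of the non-parametric route, here is a minimal sketch of rescaled-range (R/S) estimation of the Hurst exponent (Python with numpy assumed; the window sizes and the white-noise sanity check are illustrative choices, not taken from the text):

```python
# Minimal sketch of Hurst-exponent estimation via rescaled-range (R/S)
# analysis. Window sizes and the synthetic test series are illustrative.
import numpy as np

def hurst_rs(returns, window_sizes=None) -> float:
    """Estimate the Hurst exponent H of a return series.

    H ~ 0.5: no memory (consistent with weak-form efficiency);
    H > 0.5: persistent long-range memory; H < 0.5: anti-persistence.
    """
    x = np.asarray(returns, dtype=float)
    if window_sizes is None:
        window_sizes = np.unique(
            np.logspace(1, np.log10(len(x) // 2), 12).astype(int))
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):   # non-overlapping chunks
            chunk = x[start:start + n]
            z = np.cumsum(chunk - chunk.mean())     # cumulative deviations
            r = z.max() - z.min()                   # range
            s = chunk.std(ddof=1)                   # scale
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs_vals)))
    # H is the slope of log E[R/S] against log n.
    return np.polyfit(log_n, log_rs, 1)[0]

# Sanity check on white noise: H should come out near 0.5.
rng = np.random.default_rng(0)
print(hurst_rs(rng.standard_normal(5000)))
```

Under the weak-form EMH one expects H ≈ 0.5 for returns; persistently higher estimates on BTC data would signal exactly the long-range memory discussed above.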

Production of the Schizoid, End of Capitalism and Laruelle’s Radical Immanence. Note Quote Didactics.


These are eclectics of the production, eclectics of the repetition, eclectics of the difference, where the fecundity of the novelty would either spring forth, or be weeded out. There is ‘schizoproduction’ prevalent in the world. This axiomatic schizoproduction is not a speech act, but discursive, in the sense that it constrains how meaning is distilled from relations, without the need for signifying, linguistic acts. Schizoproduction performs the relation. The bare minimum of schizoproduction is the gesture of transcending thought: namely, what François Laruelle calls a ‘decision’. Decision is differential, but it does not have to signify. It is the capacity to produce distinction and separation, in the most minimal, axiomatic form. Schizoproduction is capitalism turned into immanent capitalism, through a gesture of thought – sufficient thought. It is where capitalism has become a philosophy of life, in that it has a firm belief within a sufficient thought, whatever it comes in contact with. It is an expression of the real, the radical immanence as a transcending arrangement. It is a collective articulation bound up with intricate relations and management of carnal, affective, and discursive matter. The present form of capitalism is based on relationships, collaborations, and processuality, and in this is altogether different from the industrial period of modernism in the sense of subjectivity, production, governance, biopolitics and so on. In both cases, the life of a subject is valuable, since it is a substratum of potentiality and capacity, creativity and innovation; and in both cases, a subject is produced with physical, mental, cognitive and affective capacities compatible with each arrangement. Artistic practice is aligned with a shift from modern liberalism to the neoliberal dynamic position of the free agent.

Such attributes have thus become so obvious that the concepts of ‘competence’, ‘trust’ or ‘interest’ are taken as given facts, instead of being perceived as functions within an arrangement. It is not that neoliberal management has levered the world off its joints; rather, it is capitalism as philosophy which has produced this world, and neoliberalism is just a part of that philosophy. Therefore, the thought of the end of capitalism will always be speculative, since we may regard the world without capitalism in the same way as we may regard the world-not-for-humans, which may also be a speculative one. From its inception, capitalism paved a one-way path to annihilation, predicated as it was on unmitigated growth, the extraction of finite resources, the exaltation of individualism over communal ties, and the maximization of profit at the expense of the environment and society. The capitalist world was, as Thurston Clarke described so bleakly, “dominated by the concerns of trade and Realpolitik rather than by human rights and spreading democracy”; it was a “civilization influenced by the impersonal, bottom-line values of the corporations.” Capitalist industrial civilization was built on burning the organic remains of ancient organisms, but at the cost of destroying the stable climatic conditions which supported its very construction. The thirst for fossil fuels of our globalized, high-energy economy spurred increased technological development to extract the more difficult-to-reach reserves, but this frantic grasp for what was left only served to hasten the malignant transformation of Earth into an alien world. The ruling class tried to hold things together for as long as they could by printing money, propping up markets, militarizing domestic law enforcement, and orchestrating thinly veiled resource wars in the name of fighting terrorism, but the crisis of capitalism was intertwined with the ecological crisis and could never be solved by those whose jobs and social standing depended on protecting the status quo. All the corporate PR, greenwashing, political promises, cultural myths, and anthropocentrism could not hide the harsh Malthusian reality of ecological overshoot. As crime skyrocketed and social unrest boiled over into rioting and looting, the elite retreated behind walled fortresses secured by armed guards, but the great unwinding of industrial civilization was already well underway. This evil genie was never going back in the bottle. And whether that, too, is merely speculative is a nuance to be fought hard over.

The immanence of capitalism is a transcending immanence: a system which produces a world as an arrangement, through a capitalist form of thought (the philosophy of capitalism), which is a philosophy of sufficient reason in which economy, and not the real, is the determination in the last instance. We need specifically to recognize that this world is not real. The world is a process, a “geopolitical fiction”. Aside from this, there is an unthinkable world that is not for humans. It is not the world in itself, noumena, nor is it nature, bios; rather, it is the world indifferent to and foreclosed from human thought, a foreclosed and radical immanence (the real) which is not open to, nor will ever open itself to, human thought. It will forever remain void and unilaterally indifferent. The radical immanence of the real is not an exception, analogous to the miracle in theology; rather, it is an advent of the unprecedented unknown, in which the lonely hour of the last instance never comes. This radical immanence does not confer with ‘the new’ or with ‘the same’, and does not transcend through thought. It is matter in absolute movement, into which philosophy or oikonomia incorporates conditions, concepts, and operations. Now a shift in thought is possible in which the determination in the last instance would no longer be economy but rather a radical immanence of the real, as the philosopher François Laruelle has argued. What is given, what is radically immanent in and as philosophy, is the mode of transcendental knowledge in which it operates. To know this mode of knowledge, to know it without entering into its circle, is to practice a science of the transcendental, the “transcendental science” of non-philosophy. This science is of the transcendental but, according to Laruelle, it must also itself be transcendental: it must be a global theory of the given-ness of the real. A non-philosophical transcendental is required if philosophy as a whole, including its transcendental structure, is to be received and known as it is. François Laruelle radicalises the Marxist term determination-in-the-last-instance, reworked by Louis Althusser, for whom the last instance as a dominating force was the economy. For Laruelle, the determination-in-the-last-instance is the Real, and “everything philosophy claims to master is in-the-last-instance thinkable from the One-Real”. For Althusser, referring to Engels, the economy is the ‘determination in the last instance’ in the long run, but only with respect to the other determinations by the superstructures, such as traditions. Following this, the “lonely hour of the ‘last instance’ never comes”.

Hegel and Topos Theory. Thought of the Day 46.0


The intellectual feat of Lawvere is as important as Gödel’s formal undecidability theorem, perhaps even more so. But there is a difference between the two results: whereas Gödel led into a blind alley, Lawvere has opened a new and fascinating panorama to be explored by mathematicians and philosophers. Referring to the positive results of topos theory, Lawvere says:

A science student naively enrolling in a course styled “Foundations of Mathematics” is more likely to receive sermons about unknowability… than to receive the needed philosophical guide to a systematic understanding of the concrete richness of pure and applied mathematics as it has been and will be developed. (Categories of space and quantity)

One of the major philosophical results of elementary topos theory is that the way Hegel looked at logic was, after all, on the right track. According to Hegel, formal mathematical logic was but a superficial tautologous script. True logic was dialectical, and this logic ruled the gigantic process of the development of the Idea. Inasmuch as the Idea was realizing itself through the opposition of theses and antitheses, logic changed, but not through an arbitrary change of inferential rules. Briefly: in the dialectical system of Hegel, logic was content-dependent.

Now, the fact that every topos has a corresponding internal logic shows that logic is, in quite a precise way, content-dependent: it depends on the structure of the topos. Every topos has its own internal logic, and this logic is materially dependent on the characterization of the topos. This correspondence throws new light on the relation of logic to ontology. Classically, logic was considered ontologically aseptic: there could be a multitude of different ontologies, but there was only one logic, the classical one. Of course, there were some mathematicians who proposed a different logic: the intuitionists. But this proposal was due to not very clear speculative epistemic reasons: they said they could not understand the meaning of the attributive expression “actual infinite”. These mathematicians formed a minority within the professional mathematical community and were seen as outsiders with queer ideas about the exact sciences. However, as soon as intuitionistic logic was recognized as the universal internal logic of topoi, its importance became enormous, because it provided, for the first time, a new vision of the interplay of logic with mathematics. Something had definitively changed in the philosophical panorama.
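A standard illustration of this content-dependence (a well-known fact about sheaf topoi, added here for concreteness rather than taken from the text): in the topos Sh(X) of sheaves on a topological space X, the truth values are the open sets of X, and the law of excluded middle can fail.

```latex
% In Sh(X) the subobject classifier has, as truth values over an open U,
% the open subsets of U; negation is the interior of the complement:
\Omega(U) = \{\, V \subseteq U : V \ \text{open} \,\}, \qquad
\lnot V = \operatorname{int}(X \setminus V).
% Take X = \mathbb{R} and V = \mathbb{R} \setminus \{0\}. Then
\lnot V = \operatorname{int}(\{0\}) = \emptyset, \qquad
V \lor \lnot V = V \cup \emptyset = V \neq \mathbb{R},
% so V \lor \lnot V is not the top truth value: excluded middle fails,
% and the internal logic of Sh(\mathbb{R}) is genuinely intuitionistic,
% whereas for a discrete space X the internal logic is classical.
```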

Noneism. Part 2.


Noneism is a very rigorous and original philosophical doctrine, by and large superior to the classical mathematical philosophies. But there are some problems concerning the different ways of characterizing a universe of objects. It is very easy to understand the way a writer characterizes the protagonists of the novels he writes. But what about the characterization of the universe of natural numbers? Since in most kinds of civilizations the natural numbers are characterized in the same way, we have the impression that the subject does not intervene in the forging of the characteristics of natural numbers. These numbers appear to be what they are with total independence of the creative activity of the cognitive subject. There is, of course, the creation of theorems, but the potentially infinite sequence of natural numbers resists any effort to subjectivize its characteristics; it cannot be changed. A noneist might reply that natural numbers are non-existent, that they have no being and that, in this respect, they are identical with mythological Objects. Moreover, the formal system of natural numbers can be interpreted in many ways, for instance with respect to a universe of Skolem numbers. This is correct, but it does not explain why the properties of some universes are independent of subjective creation. It is an undeniable fact that there are two kinds of objectual characteristics. On the one hand, we have the characteristics created by subjective imagination or speculative thought; on the other hand, we find characteristics that are not created by anybody: their corresponding Objects are, in most cases, non-existent but, at the same time, they are not invented. They are just found. The origin of the former characteristics is very easy to understand; the origin of the latter is a mystery.

Now, the subject-independence of a universe suggests that it belongs to a Platonic realm. As far as transfinite set theory is concerned, the subject-independence of its characteristics is much less evident than the subject-independence characteristic of the natural numbers. In the realm of the finite, both kinds of characteristics are subject-independent and can be reduced to combinatorics. The only difference between the two is that, according to the classical Platonistic interpretation of mathematics, there can only be a single mathematical universe and, to deductively study its properties, one can only employ classical logic. But this position is not at all unobjectionable. Once the subject-independence of the characteristics of the natural number system is posited, it becomes easy to overstep the classical phobia concerning the possibility of characterizing non-classical objective worlds. Euclidean geometry is incompatible with elliptic and hyperbolic geometries and, nevertheless, the validity of the first does not invalidate the others. And vice versa: the fact that hyperbolic and other kinds of geometry are consistently characterized does not invalidate good old Euclidean geometry. Likewise, the fact that we now have several kinds of non-Cantorian set theories does not invalidate classical Cantorian set theory.

Of course, a universally non-Platonic point of view, one that includes classical set theory, can also be assumed. But concerning the natural numbers it would be quite artificial: it is very difficult not to surrender to Kronecker’s famous dictum that God created the natural numbers and men created all the rest. Anyhow, it is not at all absurd to adopt a wholly Platonistic conception of mathematics, and it is quite licit to adopt a noneist position. But if we do the latter, the origin of the natural numbers’ characteristics becomes misty. However, forgetting this cloudiness, the leap from noneist universes to Platonistic ones, and vice versa, becomes like a flip-flop connecting objectological with ontological (ideal) universes: a kind of rabbit-duck Gestalt, or a Sherrington staircase. So the fundamental question with respect to subject-dependent and subject-independent mathematical theories is: are they created, or are they found? Regarding some theories, subject-dependency is far more understandable; concerning others, subject-independency is very difficult, if not impossible, to negate.

From an epistemological point of view, the existence of non-subject-dependent characteristic traits of a universe would mean that there is something like intellectual intuition. The properties of the natural numbers, the finite properties of sets (combinatorics), and some geometric axioms (for instance, in Euclidean geometry, the axioms of betweenness) would be apprehended in a manner that coincides pretty well with the (nowadays rather discredited) concept of synthetic a priori knowledge. This aspect of mathematical knowledge shows that the old problem concerning analytic and synthetic a priori knowledge must, in spite of the prevailing Quinean pragmatic conception, be radically reset.

Anthropocosmism. Thought of the Day 20.0


Russian cosmism appeared as a sort of antithesis to the classical physicalist paradigm of thinking, which was based on a strict differentiation of man and nature. It attempted to revive the ontology of an integral vision that organically unites man and cosmos. These problems were discussed in both the scientific and the religious forms of cosmism. In the religious form, N. Fedorov’s conception was the most significant. Like other cosmists, he was not satisfied with the split of the Universe into man and nature as opposed entities. Such an opposition, in his opinion, condemned nature to thoughtlessness and destructiveness, and man to submission to the existing “evil world”. Fedorov maintained the idea of a unity of man and nature, a connection between “soul” and cosmos, in terms of regulation and resurrection. He offered a project of resurrection understood not only as a resurrection of ancestors, but as containing at least two aspects: raising from the dead in a narrow, direct sense, and in a wider, metaphoric sense that includes nature’s capacity for self-reconstruction. Fedorov’s resurrection project was connected with the idea of the human mind going out into outer space. For him, “the Earth is not bound”, and “human activity cannot be restricted by the limits of the terrestrial planet”, which is only the starting point of this activity. One should look critically at the Utopian and fantastic elements of N. Fedorov’s views, which contain a considerable grain of mysticism, but there are nevertheless important rational moments in his conception: the quite clearly expressed idea of interconnection, the unity of man and cosmos, the idea of the correlation of the rational and moral elements of man, and the ideal of the unity of humanity as a planetary community of people.

But while religious cosmism was notable rather for the fantastic and speculative character of its discourses, the natural-scientific trend, in solving the problem of the interconnection between man and cosmos, paid special attention to the comprehension of the scientific achievements that confirmed that interconnection. N. G. Kholodny developed these ideas in terms of anthropocosmism, opposing it to anthropocentrism. He wrote: “Having put himself in the place of God, man destroyed his natural connections with nature and condemned himself to a long solitary existence”. In Kholodny’s opinion, anthropocentrism passed through several stages in its development. At the first stage man did not oppose himself to nature; rather, he “humanized” the natural forces. At the second stage man, extracting himself from nature, looks at it as an object for research, the basis of his well-being. At the next stage man uplifts himself over nature, relying in this activity on the spiritual forces with which he studies the Universe. And the last stage is characterized by a crisis of the anthropocentric worldview, which starts to collapse under the influence of the achievements of science and philosophy. N. G. Kholodny was right to note that in the past anthropocentrism had played a positive role: it freed man from his fright of nature by uplifting him over it. But gradually, beside anthropocentrism, there appeared sprouts of a new vision: anthropocosmism. Kholodny regarded anthropocosmism as a certain line of development of the human intellect, will and feelings, which led people to their aims. An essential element in anthropocosmism was the attempt to reconsider the question of man’s place in nature, and of his interrelations with cosmos, on the foundation of natural-scientific knowledge.

Hyman Minsky, Karl Polanyi, Deleterious Markets and if there is any Alt-Right to them? Apparently No.

Karl Polanyi often highlighted the predicaments of the market: these perils arise because markets left to their own devices are enough to cause attrition to social relations and the social fabric. The social consequences of financial instability, however, can be understood only by employing a Polanyian perspective on how processes of commodification and market expansion jeopardize social institutions. For someone like Hyman Minsky, equilibrium and stability are elusive conditions in markets with debt contracts. His financial instability hypothesis suggests that capitalist economies lead, through their own dynamics, to “the development over historical time of liability structures that cannot be validated by market-determined cash flows or asset values”. According to Minsky, a stable period generates optimistic expectations. Increased confidence and positive expectations of future income streams cause economic actors to decrease the margins of safety in their investment decisions. This feeds a surge in economic activity and profits, which turns into a boom as investments are financed by ever higher degrees of indebtedness. As the economic boom matures, an increasing number of financial intermediaries and firms switch from hedge finance to speculative and Ponzi finance. Minsky argued that economists, misreading Keynes, downplay the role of financial institutions. In particular, he argued that financial innovation can create economic euphoria for a while before destabilizing the economy and hurling it into crises rivaling the Great Depression. Minsky’s insights are evident in the effects of innovations in mortgages and mortgage securities. Actors using speculative and Ponzi finance are vulnerable to macroeconomic volatility and interest rate fluctuations: a boom ends when movements in short-term and long-term interest rates render the liability structures of speculative and Ponzi finance unsustainable. The likelihood of a financial crisis (as opposed to a mere business cycle) depends on the preponderance of speculative and Ponzi finance in the economy in question.
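The hedge/speculative/Ponzi taxonomy can be stated operationally. Here is a minimal sketch in plain Python (the cash-flow test is the standard textbook reading of Minsky; the field names are illustrative): a unit is hedge-financing if expected income covers interest and principal, speculative if it covers only interest (so principal must be rolled over), and Ponzi if it cannot even cover interest (so debt grows).

```python
# Minimal sketch of Minsky's financing taxonomy. The classification rule
# (income vs. interest and principal due) is the standard reading of the
# financial instability hypothesis; the dataclass fields are illustrative.
from dataclasses import dataclass

@dataclass
class Unit:
    name: str
    expected_income: float   # cash inflow per period
    interest_due: float      # debt service, interest portion
    principal_due: float     # debt service, principal portion

def classify(u: Unit) -> str:
    if u.expected_income >= u.interest_due + u.principal_due:
        return "hedge"        # income validates the whole liability structure
    if u.expected_income >= u.interest_due:
        return "speculative"  # interest covered, principal must be rolled over
    return "ponzi"            # not even interest covered: debt compounds

# A rate shock raises interest_due and pushes units down the ladder,
# which is Minsky's point about booms ending with interest rate movements.
for unit in [Unit("A", 120, 30, 60), Unit("B", 80, 30, 60), Unit("C", 20, 30, 60)]:
    print(unit.name, classify(unit))   # A hedge, B speculative, C ponzi
```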


Minsky regularly criticized economists for failing to grasp Keynes’s ideas. In his book Stabilizing an Unstable Economy, Minsky argued that while economists assimilated some of Keynes’s insights into standard economic theory, they failed to grasp the connection between the financial and real sectors. Specifically, he argued that finance, with its focus on capital structure, asset-liability management, agency theory and contracts, is missing from macroeconomic theory. He wrote:

Keynes’s theory revolves around bankers and businessmen making deals on Wall Street … One of the peculiarities of the neoclassical theory that preceded Keynes and the neoclassical synthesis that now predominates economic theory is that neither allows the activities that take place on Wall Street to have any significant impact upon the coordination or lack of coordination of the economy…

Minsky’s work on financial crises builds on Keynes’s insights, using terms such as “euphoric economy” and “unrealistic euphoric expectations with respect to costs, markets, and their development over time”. Yet Minsky considered the issues of rational prices and market efficiency to be only the tip of an iceberg. His broader framework addresses issues related to the lending practices of financial institutions, central bank policy, fiscal policy, the efficacy of financial market regulation, employment policy, and income distribution. Financial institutions, such as banks, become increasingly innovative in their use of financial products when the business cycle expands, boosting their leverage and funding projects of ever increasing risk. Minsky’s words on financial innovation are striking, as if foretelling the recent crisis.

Over an expansion, new financial instruments and new ways of financing activity develop. Typically, defects of the new ways and the new institutions are revealed when the crunch comes.

Commercial banks sponsored conduits to finance long-term assets through special-purpose entities such as structured investment vehicles (SIVs), something similar to the Indian version of Special Purpose Vehicles (SPVs). These were off-balance-sheet entities, subject to lower regulatory capital requirements. The special-purpose entities used commercial paper to raise funds which they then used to buy mortgages and mortgage securities. In effect, banks relied on Minsky-type speculative and Ponzi financing: borrowing short-term and using the borrowed funds to buy long-term assets. Wrote Minsky,

The standard analysis of banking has led to a game that is played by central banks, henceforth to be called the authorities, and profit-seeking banks. In this game, the authorities impose interest rates and reserve regulations and operate in money markets to get what they consider to be the right amount of money, and the banks invent and innovate in order to circumvent the authorities. The authorities may constrain the rate of growth of the reserve base, but the banking and financial structure determines the efficacy of reserves…This is an unfair game. The entrepreneurs of the banking community have much more at stake than the bureaucrats of the central banks. In the postwar period, the initiative has been with the banking community, and the authorities have been “surprised” by changes in the way financial markets operate. The profit-seeking bankers almost always win their game with the authorities, but, in winning, the banking community destabilizes the economy; the true losers are those who are hurt by unemployment and inflation.
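To put the SIV maturity mismatch described above in numbers, here is a minimal sketch in plain Python (the yields, funding rates and notional are invented for illustration): the vehicle earns carry as long as freshly rolled short-term paper costs less than the long-term asset yields, and the carry collapses when the crunch arrives.

```python
# Minimal sketch of an SIV-style maturity mismatch: a long-term asset
# funded by rolling short-term commercial paper. All numbers invented.
def siv_carry(asset_yield: float, cp_rates: list[float], notional: float = 100.0) -> float:
    # Per-period carry: what the asset earns minus what the freshly
    # rolled paper costs, summed over the rollover periods.
    return sum(notional * (asset_yield - r) for r in cp_rates)

calm   = [0.03] * 10                 # stable, cheap funding
crunch = [0.03] * 8 + [0.08, 0.12]   # "defects ... revealed when the crunch comes"
print(siv_carry(0.06, calm))         # ~30.0: steady positive carry
print(siv_carry(0.06, crunch))       # ~16.0: carry collapses once funding spikes
```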


Combining Hyman Minsky’s insights on financial fragility with a Polanyian focus on commodification offers a distinct perspective on the causes and consequences of the foreclosure crisis. First, following Polanyi, we should expect to find the commodity fiction, applied to arenas of social life previously isolated from markets, at the heart of the recent financial crisis. Second, following Minsky, the transformations caused by novel uses of the commodity fiction should be among the primary causes of financial fragility. Finally, in line with a Polanyian focus on the effects of the supply-demand-price mechanism, the price fluctuations caused by financial fragility should disrupt existing social relations and institutions in a significant manner. So, how does all this come down to the alt-right? Right-wing libertarianism is basically impossible: the “free” market as we know it today needs the state to be implemented. Even without reading Polanyi, one knows that without the force of the state you simply cannot have private property or the legal arrangements that underpin property, labour and money; so it would not work anyway. Polanyi’s point is that if we want democracy to survive, we need to beware of financial overlords and their ideological allies peddling free-market utopias. And if democracy so much as stinks of stability, then, as Minsky would have it, that stability is itself destabilizing, thus corroborating the cross-purposes between the two thinkers in question, at least as a starting point.