The New Lexicon of Hate


One reason why ‘cosmopolitan’ is an unnerving term is that it was the key to an attempt by Soviet dictator Josef Stalin to purge the culture of dissident voices. In a 1946 speech, he deplored works in which ‘the positive Soviet hero is derided and inferior before all things foreign and cosmopolitanism that we all fought against from the time of Lenin, characteristic of the political leftovers, is many times applauded.’ It was part of a yearslong campaign aimed at writers, theater critics, scientists and others who were connected with ‘bourgeois Western influences.’ Not so incidentally, many of these ‘cosmopolitans’ were Jewish, and official Soviet propaganda for a time devoted significant energy to ‘unmasking’ the Jewish identities of writers who published under pseudonyms.

Something is rotten with liberalism’s reigning manifestation, its stench discernible to everyone but itself. A sterile managerialism – epitomized by what Oscar Wilde decried as “the monstrous worship of facts” and distilled into policy wonkery and modish Vox explainers – had the rug yanked from under it on Nov. 8. It was an unexpected stumble across the Rubicon, one that left the ruling consensus forsaken, crestfallen, and discombobulated within a ruptured sociopolitical milieu it no longer recognized.

Donald Trump is the expression of the id, animated by libidinal whims, repressed desires, and resentments; the liberal establishment is the moralizing superego, issuing commands about appropriate conduct and policing discourse. Upon losing control of the id, the compulsion to fact-check and bellow “This is not normal!” into the post-truth abyss turned liberals, Emmett Rensin proclaims, into “the blathering superego at the end of history.”

In this political order, transgression and libertinism appeared as cathartic outlets. Irony was weaponized, and guileful wordplay camouflaged bigotry. Such was the transgressive thrill of Trumpism: the enjoyment of stating publicly what is not supposed to be said, tapping into what Jacques Lacan termed jouissance – the desire to go beyond the limits of publicly accepted discourse. Unsurprisingly, this shift toward social sadism is echoed in online culture, especially in trolling. The so-called alt-right embraced trolling, shrugging off accusations of racism and sexism by adopting a sardonic posture that let it wash its hands of charges of prejudice. “You just don’t get it,” went the customary rebuke. They know their liberal opponents well, homing in on their conscience and sanctimonious virtue-signaling. Witch-hunting and online harassment are favored strategies for hounding feminists, social justice warriors, and other moralists. Equivalent disdain is reserved for establishment conservatives, branded “cuckservatives” for having stood by as the positional gains of minorities emasculated White America.

There is an inclination to reduce the alt-right’s pranksterism to a pop-cultural spectacle, rather than a crucible of virulent ethno-nationalism that must be confronted and refuted. While a profusion of irony, memes, and in-jokes does not a movement make, it is important to eschew the mere revulsion that characterizes much of the response to this nebulous amalgam.

Conservatism, after all, can summon a radical undercurrent when necessary. Fundamentally reactionary rather than rigidly traditionalist, it is willing to absorb and redirect the potency of new revolutionary actors toward counter-revolution and new relations of domination. Political scientist Corey Robin identifies this tendency in “The Reactionary Mind: Conservatism from Edmund Burke to Sarah Palin,” where he points out that the right is more than happy to violently upend an anemic ruling class and install a more dynamic one in its place, even if that means using the tactics and rhetoric of its ideological rivals. As Robin notes, “While conservatives are hostile to the goals of the left . . . they often are the left’s best students.”

Reductionism of Numerical Complexity: A Wittgensteinian Excursion


Wittgenstein’s criticism of Russell’s logicist foundation of mathematics, contained in the Remarks on the Foundations of Mathematics, consists in saying that it is not the formalized version of mathematical deduction that vouches for the validity of the intuitive version, but the converse.

If someone tries to shew that mathematics is not logic, what is he trying to shew? He is surely trying to say something like: If tables, chairs, cupboards, etc. are swathed in enough paper, certainly they will look spherical in the end.

He is not trying to shew that it is impossible that, for every mathematical proof, a Russellian proof can be constructed which (somehow) ‘corresponds’ to it, but rather that the acceptance of such a correspondence does not lean on logic.

Taking up Wittgenstein’s criticism, Hao Wang (Computation, Logic, Philosophy) discusses the view that mathematics “is” axiomatic set theory as one of several possible answers to the question “What is mathematics?”. Wang points out that this view is epistemologically worthless, at least as far as the task of understanding what guides mathematical cognition is concerned:

Mathematics is axiomatic set theory. In a definite sense, all mathematics can be derived from axiomatic set theory. [ . . . ] There are several objections to this identification. [ . . . ] This view leaves unexplained why, of all the possible consequences of set theory, we select only those which happen to be our mathematics today, and why certain mathematical concepts are more interesting than others. It does not help to give us an intuitive grasp of mathematics such as that possessed by a powerful mathematician. By burying, e.g., the individuality of natural numbers, it seeks to explain the more basic and the clearer by the more obscure. It is a little analogous to asserting that all physical objects, such as tables, chairs, etc., are spherical if we swathe them with enough stuff.
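Wang’s complaint about “burying the individuality of natural numbers” can be made concrete. The sketch below uses Python, an illustrative choice not made in the text, and the standard von Neumann encoding 0 = ∅, n+1 = n ∪ {n}; Wang’s objection would apply just as well to any other set-theoretic coding.

```python
# A minimal sketch of the von Neumann encoding of the natural numbers
# as pure sets, with frozensets standing in for sets of sets.

def von_neumann(n):
    """Return the von Neumann set coding the natural number n."""
    current = frozenset()              # 0 is the empty set
    for _ in range(n):
        current = current | {current}  # successor: n + 1 = n ∪ {n}
    return current

for n in range(4):
    print(n, "=", von_neumann(n))
# 0 = frozenset()
# 1 = frozenset({frozenset()})
# 2 = frozenset({frozenset(), frozenset({frozenset()})})
# ...
```

Already at n = 3 the printout is a thicket of braces in which nothing recognizably “numerical” survives; the individuality of the numbers has been buried exactly as Wang says.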

Reductionism is an age-old project; a close forerunner of its incarnation in set theory was the arithmetization program of the 19th century. It is interesting that one of its prominent representatives, Richard Dedekind (Essays on the Theory of Numbers), exhibited a rather distanced attitude toward a thoroughgoing execution of the program:

It appears as something self-evident and not new that every theorem of algebra and higher analysis, no matter how remote, can be expressed as a theorem about natural numbers [ . . . ] But I see nothing meritorious [ . . . ] in actually performing this wearisome circumlocution and insisting on the use and recognition of no other than rational numbers.

Perec wrote a detective novel without using the letter ‘e’ (La Disparition; in English, A Void), thus proving not only that such an enormous enterprise is indeed possible but also that formal constraints sometimes have great aesthetic appeal. The translation of mathematical propositions into a poorer linguistic framework can easily be compared with such painful lipogrammatic exercises. In principle, all logical connectives can be simulated in a framework using only Sheffer’s stroke, and all cuts (in Gentzen’s sense) can be eliminated; one can dispense with ordinary language in mathematics altogether and formalize everything, and so on: in principle, one could leave out a whole lot of things. In doing so, however, one would depart from the way of thinking the mathematician actually employs (who really uses “and” and “not” and cuts, and who does not reduce many things to formal systems). Obviously, it is the proof theorist, as a working mathematician, who is interested in things like the reduction to Sheffer’s stroke, since it allows for more concise proofs by induction in the analysis of a logical calculus. Hence this proof theorist has much the same motives as a mathematician working on other problems who avoids a completely formalized treatment of them because he is not interested in the proof-theoretical aspect.
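For concreteness, here is a minimal sketch (again in Python, an illustrative choice not made in the text) of the classical simulation of the connectives by Sheffer’s stroke alone, checked against an exhaustive truth table:

```python
from itertools import product

# Sheffer's stroke: p | q means "not both p and q" (NAND).
def stroke(p, q):
    return not (p and q)

# The classical simulations of the usual connectives by the stroke alone.
def not_(p):    return stroke(p, p)                        # ¬p  = p|p
def and_(p, q): return stroke(stroke(p, q), stroke(p, q))  # p∧q = (p|q)|(p|q)
def or_(p, q):  return stroke(stroke(p, p), stroke(q, q))  # p∨q = (p|p)|(q|q)

# Exhaustive check that the simulations agree with the real connectives.
for p, q in product([False, True], repeat=2):
    assert not_(p) == (not p)
    assert and_(p, q) == (p and q)
    assert or_(p, q) == (p or q)
print("all simulations verified")
```

The three definitions are precisely the lipogrammatic exercise: each connective survives, but only in disguise.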

There might be quite similar reasons for the interest of some set theorists in expressing the usual mathematical constructions exclusively with the expressive means of ZF (i.e., in terms of ∈). But beyond this, is there any philosophical interpretation of such a reduction? In the last analysis, mathematicians always transform (and that means: change) their objects of study in order to make them accessible to certain mathematical treatments. If one considers a mathematical concept as a tool, one not only uses it differently from how one would use it considered as an object; its semiotic representation, moreover, takes a different form in the two cases. In this sense, the proof theorist has to “change” the mathematical proof (which is his or her object of study, to be treated with mathematical tools). When stating that something is used as object or as tool, we always have to ask: in which situation, and by whom?
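A standard instance of expressing a construction “in terms of ∈” is Kuratowski’s ordered pair, (a, b) = {{a}, {a, b}}. The sketch below (Python frozensets again standing in for pure sets, purely for illustration) shows how even this elementary construction turns a transparent object into nested braces:

```python
# Kuratowski's ordered pair: (a, b) = {{a}, {a, b}}.
# Pairs are thereby expressed purely in terms of set membership.

def kpair(a, b):
    return frozenset({frozenset({a}), frozenset({a, b})})

zero = frozenset()        # 0 in the von Neumann encoding
one = frozenset({zero})   # 1 = {0}
print(kpair(zero, one))   # prints a thicket of nested frozensets
```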

A second observation is that translating propositional formulæ into terms of Sheffer’s stroke in general yields quite complicated new formulæ. What is “simple” here is the particularly small number of primitive symbols needed; but neither does the semantics become clearer (p|q means “not both p and q”; cognitively, this looks more complex than “p and q”, and so on), nor are the formulæ one gets “short”. What is sought here, then, is a reduction of numerical complexity, while the primitive basis attained by the reduction looks cognitively less “natural” than the original situation (or, as Peirce expressed it, “the consciousness in the determined cognition is more lively than in the cognition which determines it”); similarly in the case of cut elimination. In contrast to this, many philosophers are convinced that the primitive basis of operating with sets really constitutes a “natural” basis of mathematical thinking, i.e., that such operations are the “standard bricks” of which this thinking is actually made – while no one will reasonably claim that expressions of the type p|q play a similar role for propositional logic. And yet reduction to set theory does not really have the task of “explanation”. True, one thus reduces propositions about “complex” objects to propositions about “simple” objects; the propositions themselves, however, thereby become in general more complex. Couched in Fregean terms, one can perhaps more easily grasp their denotation (since the denotation of a proposition is its truth value), but not their meaning. A more involved conceptual framework, however, might lead to simpler propositions (and in most cases has actually been introduced precisely in order to do so). A parallel argument concerns deductions: in its totality, a deduction becomes more complex (and less intelligible) when decomposed into elementary steps.
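The growth in formula size is easy to measure. The following naive translator (a sketch only; a cleverer translation could share subformulas, though the written-out formula would still lengthen) rewrites a small formula into stroke-only form and compares syntax-tree sizes:

```python
# Formulas are tuple-trees:
# ('var', name) | ('not', f) | ('and', f, g) | ('or', f, g) | ('nand', f, g)

def size(f):
    """Number of nodes in the syntax tree of formula f."""
    return 1 if f[0] == 'var' else 1 + sum(size(g) for g in f[1:])

def to_nand(f):
    """Rewrite a formula over not/and/or into one using only the stroke."""
    if f[0] == 'var':
        return f
    if f[0] == 'not':                  # ¬p  = p|p
        a = to_nand(f[1])
        return ('nand', a, a)
    if f[0] == 'and':                  # p∧q = (p|q)|(p|q)
        n = ('nand', to_nand(f[1]), to_nand(f[2]))
        return ('nand', n, n)
    if f[0] == 'or':                   # p∨q = (p|p)|(q|q)
        a, b = to_nand(f[1]), to_nand(f[2])
        return ('nand', ('nand', a, a), ('nand', b, b))

p, q, r = ('var', 'p'), ('var', 'q'), ('var', 'r')
f = ('or', ('and', p, q), ('not', r))   # (p ∧ q) ∨ ¬r
print(size(f), '->', size(to_nand(f)))  # 6 -> 23
```

Six nodes become twenty-three: the stock of primitives shrinks to one, and the formulæ pay for it.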

Now, it is open to discussion whether, in the case of some set operations, it is admissible at all to claim that they are basic to thinking (as is certainly true of the connectives of propositional logic). It is perfectly possible that the common sense which organizes the acceptance of certain operations as a natural basis relies on something quite different, something without the character of eternal laws of thought: it relies on training.

Is it possible to observe that a surface is coloured red and blue; and not to observe that it is red? Imagine a kind of colour adjective were used for things that are half red and half blue: they are said to be ‘bu’. Now might not someone be trained to observe whether something is bu; and not to observe whether it is also red? Such a man would then only know how to report: “bu” or “not bu”. And from the first report we could draw the conclusion that the thing was partly red.