Rhizomatic Extreme-Right.


In the context of extreme right-wing politics in the contemporary age, groupuscules can be defined as numerically negligible political, frequently metapolitical, but never party-political entities formed to pursue palingenetic ideological, organizational or activistic ends, with the ultimate goal of overcoming the decadence of the liberal-democratic system. Though they are fully formed and autonomous, they have small active memberships and minimal, if any, public visibility or support, which is now inflating. Yet they acquire enhanced influence and significance through the ease with which they can be associated, even if only in the minds of political extremists, with other grouplets which are sufficiently aligned ideologically and tactically to complement each other's activities in their bid to institute a new type of society. As a result the groupuscule has the Janus-headed characteristic of combining organizational autonomy with the ability to create informal linkages with, or reinforce the influence of, other such formations. This enables groupuscules, when considered in terms of their aggregate impact on politics and society, to be seen as forming a non-hierarchical, leaderless and centreless, or rather polycentric, movement with fluid boundaries and constantly changing components. This groupuscular right has the characteristics of a political and ideological subculture rather than a conventional political party movement, and is perfectly adapted to the task of perpetuating revolutionary extremism in an age of relative political stability.

The outstanding contrast between the groupuscular and the party-political organization of the extreme right is that instead of being formed into a tree-like hierarchical organism it is now rhizomatic. The use of the term was pioneered in the spirit of post-structuralist radicalism by Deleuze and Guattari to help conceptualize social phenomena to which, metaphorically at least, the attributes of supra-personal organic life-forms can be ascribed, but which are not structured in a coherently hierarchical or systematically interconnected way that would make tree-based or dendroid metaphors appropriate. When applied to the groupuscular right, the concept of the rhizome throws into relief its dynamic nature as a polycentric, leaderless movement by stressing that it does not operate like a single organism such as a tree, with a tap-root, branches and canopy, and a well-defined beginning and end. Instead, it behaves like the root-system of some species of grass or tuber, displaying multiple starts and beginnings which intertwine and connect with each other, constantly producing new shoots as others die off in an unpredictable, asymmetrical pattern of growth and decay. If a political network has a rhizomic structure, it forms a cellular, capillary network with ill-defined boundaries and no formal hierarchy or internal organizational structure to give it a unified intelligence. Thanks to its rhizomic structure the groupuscular right no longer emulates a singular living organism, as the slime-mould is so mysteriously capable of doing. Nor is it to be seen as made up of countless tiny, disconnected micro-organisms.
Instead, following an internal dynamic which only the most advanced life sciences can model with any clarity, the minute bursts of spontaneous creativity which produce and maintain individual groupuscules constitute nodal points in a force-field or web of radical political energy which fuels the vitality and viability of the organism as a whole. These qualities duplicate the very features of the Internet that make it impossible to shut down or wipe out the information it contains simply by knocking out any one part of it, since there is no mission control to destroy. The groupuscularity of the contemporary extreme right makes it eminently able to survive and grow even if some of the individual organizations which constitute it are banned and their websites closed down.

From Slime Mould to Rhizome


Gothic: Once Again Atheistic Materialism and Hedonistic Flirtations. Drunken Risibility.



The machinery of the Gothic, traditionally relegated to both a formulaic and a sensational aesthetic, gradually evolved into a recyclable set of images, motifs and narrative devices that surpass temporal, spatial and generic categories. From the moment of its appearance the Gothic has been obsessed with presenting itself as an imitation.

Recent literary theory has extensively probed into the power of the Gothic to evade temporal and generic limits and into the aesthetic, narratological and ideological implications this involves. Officially granting the Gothic the elasticity it has always entailed has resulted in a reconfiguration of its spectrum both synchronically – by acknowledging its influence on numerous postmodern fictions – and diachronically – by rescripting, in hindsight, the history of its canon so as to allow space for ambiguous presences.

Both transgressive and hybrid in form and content, the Gothic has been accepted as a malleable genre, flexible enough to create more freely, in Borgesian fashion, its own precursors. The genre flouted what are considered the basic principles of good prose writing: adherence to verisimilitude and avoidance of both narrative diversions and moralising – all of which are, of course, made to be deliberately upset. Many merely cite the epigrammatic power of the most renowned phrase of Sade's essay on the novel, that the rise of the Gothic “was the inevitable result of the revolutionary shocks which all of Europe has suffered”.

Eighteenth-century French materialist philosophy proposed the displacement of metaphysical investigations into the meaning of life by materialist explorations. Julien Offray de La Mettrie, a French physician and philosopher and the earliest of the materialist writers of the Enlightenment, published the materialist manifesto L’Homme machine (Man a Machine), which did away with the transcendentalism of the soul, banished all supernatural agencies by claiming that mind is as mechanical as matter, and equated humans with machines. In his words: “The human body is a machine that winds up its own springs: it is a living image of the perpetual motion”. French materialist thought culminated in the publication of the great 28-volume Encyclopédie, ou Dictionnaire raisonné des sciences, des arts et des métiers, par une société de gens de lettres, edited by Denis Diderot and Jean Le Rond d’Alembert, which was grounded on purely materialist principles, against all kinds of metaphysical thinking. Diderot’s atheist materialism set the tone of the Encyclopédie, which, for both editors, was the ideal vehicle […] for reshaping French high culture and attitudes, as well as the perfect instrument with which to insinuate their radical Weltanschauung surreptitiously, using devious procedures, into the main arteries of French society, embedding their revolutionary philosophic manifesto in a vast compilation ostensibly designed to provide plain information and basic orientation but in fact subtly challenging and transforming attitudes in every respect. While the materialist thinkers ultimately disowned La Mettrie because he ran counter to their systematic moral, political and social naturalism, someone like Sade remained deeply influenced and inspired, indebted to La Mettrie’s atheism and hedonism, particularly to the perception of virtue and vice as relative notions − the result of socialisation and at odds with nature.


NVSQVAM (Nowhere): Left has Hemorrhaged its Mojo


Left-liberal attitudes and habits of mind may at one time have been radical, provocative, and gutsy, but today they are staid, stale, conventional, and boring. Any honest contemporary cultural Marxist will have to admit that, politically speaking, his side now holds all significant power. Those who openly decline to subscribe to the ideological establishment’s point of view on such matters as race, gender, and sexuality have in effect committed social suicide; having put themselves utterly at the mercy of the powers-that-be, such unfortunates have left themselves open to attack by legions of official Zeitgeist-enforcers and their numerous toadying minions.

Today’s thought-criminals and ideological deviants are liable to be thrown in jail or fined for indulging in so-called “hate speech,” or at the very least, to be subjected to harassment, humiliation, and deprivation of livelihood. It is, in short, a bad career move not to toe the company line. Even in a country where free expression is nominally protected, one still in actuality faces a stark choice: conform to the enforced conventional wisdom, or be thrust into the outer darkness.

For radical traditionalists, alternative rightists, race realists, and other such present-day thought-criminals, things seem dire indeed. Yet all is not lost, and much, in fact, has been won. For our adversaries’ victory on cultural matters is very much a Pyrrhic one. In becoming the Establishment, the Left has hemorrhaged its mojo. To be a lefty today has none of the allure or glamour that it once possessed in halcyon times when one actually faced persecution and ostracism for taking up left-wing causes. One who spouts liberal rhetoric and parrots politically-correct bromides doesn’t seem like a troublemaker, but rather a brown-nosing goody-goody. A defiant rightist, on the other hand, has gained the status of a dangerous outlaw; though reviled, feared, and loathed by the authority-fearing populace, such a one nevertheless exudes an exciting primordial appeal for his insolent refusal to curtsy before the almighty Zeitgeist. There is more of Andy Nowicki to come.

Categorial Logic – Paracompleteness versus Paraconsistency. Thought of the Day 46.2


The fact that logic is content-dependent opens a new horizon concerning the relationship of logic to ontology (or objectology). Although the classical concepts of a priori and a posteriori propositions (or judgments) have lately become rather blurred, there is an undeniable fact: it is certain that the distant origin of mathematics is based on empirical practical knowledge, but nobody can claim that higher mathematics is empirical.

Thanks to category theory, it is an established fact that certain very important logical systems, the classical and the intuitionistic (with all its axiomatically enriched subsystems), can be interpreted through topoi. And this possibility permits us to consider topoi, be it in a Noneist or in a Platonist way, as universes, that is, as ontologies or as objectologies. Now, the association of a topos with its corresponding ontology (or objectology) is quite different from the association of theoretical terms with empirical concepts. Within the frame of a physical theory, if a new fact is discovered in the laboratory, it must be explained through logical deduction (with the due initial conditions and some other details). If a logical conclusion is inferred from the fundamental hypotheses, it must be corroborated through empirical observation. And if the corroboration fails, the theory must be readjusted or even rejected.

In the case of categorial logic, the situation has some similarity with the former case; but we must be careful not to be influenced by apparent coincidences. If we add, as an axiom, the tertium non datur to formalized intuitionistic logic we obtain classical logic. That is, we can formally pass from the one to the other just by adding or suppressing the tertium. This fact could induce us to think that, just as in physics, if a logical theory, let us say intuitionistic logic, cannot include a true proposition, then its axioms must be readjusted to make it possible to include it among its theorems. But there is a radical difference: in the semantics of intuitionistic logic, and of any logic, the point of departure is not a set of hypothetical propositions that must be corroborated through experiment; it is a set of propositions that are true under some interpretation. This set can be axiomatic or it can consist in rules of inference, but the theorems of the system are not submitted to verification. The derived propositions are just true, and nothing more. The logician surely tries to find new true propositions but, when they are found (through some effective method, which can be intuitive, as it is in Gödel’s theorem), there are only three possible cases: they can be formally derivable, they can be formally underivable, or they can be formally neither derivable nor underivable, that is, undecidable. But undecidability does not induce the logician to readjust or to reject the theory. Nobody tries to add axioms or to diminish them. In physics, when we are handling a theory T, and a new describable phenomenon is found that cannot be deduced from the axioms (plus initial or some other conditions), T must be readjusted or even rejected. A classical logician will never think of changing the axioms or rules of inference of classical logic because some proposition is undecidable in it. And an intuitionistic logician would never think of adding the tertium to the axioms of Heyting’s system just because it cannot be derived within them.
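The formal point about the tertium can be illustrated with a small sketch in Lean 4 (an illustrative example, not part of the original argument): double negation elimination is not derivable in Heyting's system, but once the tertium non datur is supplied as a hypothesis for a proposition, it follows by intuitionistically valid case analysis alone:

```lean
-- Double negation elimination is not derivable in Heyting's
-- intuitionistic system; but if the tertium non datur (em) is
-- assumed for p, it follows constructively:
theorem dne_from_em (p : Prop) (em : p ∨ ¬p) : ¬¬p → p := by
  intro hnn
  cases em with
  | inl hp  => exact hp                -- p holds outright
  | inr hnp => exact absurd hnp hnn    -- ¬p contradicts ¬¬p
```

Assuming `em` globally, for every proposition, is exactly the step that formally turns intuitionistic logic into classical logic; nothing beyond intuitionistic reasoning is used once the tertium is in hand.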

The foregoing considerations sufficiently show that in logic and mathematics there is something that, with full right, can be called “a priori”. And although, as we have said, we must acknowledge that the concepts of a priori and a posteriori are not clear-cut, in some cases we can rightly speak of synthetic a priori knowledge. For instance, the Gödel proposition that affirms its own underivability is synthetic and a priori. But there are other propositions, for instance mathematical induction, that can also be considered synthetic and a priori. And a great many mathematical definitions that are not abbreviations are synthetic. For instance, the definition of a monoid action is synthetic (and, of course, a priori) because the concept of a monoid does not have among its characterizing traits the concept of an action, and vice versa.

Categorial logic is the deepest knowledge of logic that has ever been achieved. But its scope does not encompass the whole field of logic. There are other kinds of logic that are also important and, if we intend to know, as much as possible, what logic is and how it is related to mathematics and ontology (or objectology), we must pay attention to them. From a mathematical and a philosophical point of view, the most important non-paracomplete logical systems are the paraconsistent ones. These systems are something like a dual to paracomplete logics. They are employed in inconsistent theories without producing triviality (in this sense relevant logics are also paraconsistent). In intuitionistic logic there are interpretations that, with respect to some topoi, include two false contradictory propositions; whereas in paraconsistent systems we can find interpretations in which there are two contradictory true propositions.

There is, though, a difference between paracompleteness and paraconsistency. Insofar as mathematics is concerned, paracomplete systems had to be coined to cope with very deep problems. The paraconsistent ones, on the other hand, although they have been applied with success to mathematical theories, were conceived for purely philosophical and, in some cases, even political and ideological motivations. The common point of them all was the need to construct a logical system able to cope with contradictions. That means: to have at one’s disposal a deductive method which offered the possibility of deducing consistent conclusions from inconsistent premisses. Of course, the inconsistency of the premisses had to comply with some (although very wide) conditions to avoid triviality. But these conditions made it possible to cope with paradoxes or antinomies with precision and mathematical sense.

But, philosophically, paraconsistent logic has another very important property: it can be used in a spontaneous way to formalize naive set theory, that is, the kind of theory that pre-Zermelian mathematicians had always employed. And it is, no doubt, important to try to develop mathematics within the frame of naive, spontaneous mathematical thought, without falling into the artificiality of modern set theory. The formalization of the naive way of mathematical thinking, although every formalization is unavoidably artificial, has opened the possibility of coping with dialectical thought.

Ejecting and Injecting Pronouns in Capitalism. Drunken Risibility.


I want to de-agentify myself. It is not ‘you’, or ‘I’, or ‘me’. It is the questioning of such pronouns. And why are they required at all? As I have always maintained, agencies would disappear in capitalism. And when we have capitalism everywhere, we would cease to be “we”. “We” would become part of the “IT”. And this is emancipation of the highest order. Teleology would be replaced by eschatology. Alienation would be replaced with emancipation. Teleology is alienating, whereas eschatology is emancipating. Agency would become un-agency. An emancipation from alienation, from being, into the arms of becoming, for the former is a mere snapshot of the illusory order, whereas the latter is a continuum of fluidity, the fluid dynamics of the deracinated from the illusory order. The “IT” is pure and brute materialism, the cosmic unfoldings beyond our understanding and, importantly, mirrored in the terrestrial. “IT” is not to be realized. “IT” is what engulfs us, kills us, and in the process emancipates us from alienation. “IT” is “Realism”, a philosophy without “we”, Capitalism’s excessive power. “IT” enslaves “us” to the point of our losing any identification. Capital is the fluidity of encampment. Ideologizing it is capitalism. This fluid is hotter than molten lava and colder than the ice-caps. This fluidity does not believe in layers, and to that extent is democratic. Capital is the flow of desires/passions/reasons that criss-cross with the material that goes on to build them. Capital is abstract only insofar as speculations are concrete. Both morph into each other.

Representation as a Meaningful Philosophical Quandary


The deliberation on representation indeed becomes a meaningful quandary if most of the shortcomings are to be overcome without actually accepting the way they permeate scientific and philosophical discourse. The problem is more ideological than one could have imagined, since it is only within the space of this quandary that one can assume success in overthrowing the quandary. Unless the classical theory of representation that guides expert systems is accepted as existing, there is no way to dislodge the relationship of symbols and meanings that builds up such systems, lest the predicament of falling prey to the Scylla of a metaphysically strong notion of meaningful representation as natural, or the Charybdis of an external designer, should gobble us up. If one somehow escapes these maliciously aporetic entities, representation as a metaphysical monster stands to block our progress. Is it really viable, then, to think of machines that can survive this representational foe, a foe that gets no aid from the clusters of internal mechanisms? The answer is very much in the affirmative, provided a consideration of such a non-representational system as continuous and homogeneous is done away with, and in its place are put functional units that are no longer representational ones, for they derive their efficiency and legitimacy through autopoiesis. What is required is to consider this notional representational critique of distributed systems against the objectivity of science, since objectivity as a property of science has an intrinsic value of independence from the subject who studies the discipline. Kuhn had some philosophical problems with this precise way of treating science as an objective discipline. For Kuhn, scientists operate under or within paradigms, thus obligating hierarchical structures.
Such hierarchical structures ensure the position of scientists to voice their authority on matters of dispute, and when there is a crisis within, or for, the paradigm, scientists, to begin with, do not outrightly reject the paradigm, but try their level best at resolving it. In cases where resolution becomes a difficult task, an outright rejection of the paradigm follows suit, thus effecting what is commonly called the paradigm shift. If such were the case, obviously, the objective tag for science takes a hit, and Kuhn argues in favor of a shift in the social order that science undergoes, signifying the subjective element. Importantly, these paradigm shifts occur to benefit scientific progress and, in almost all cases, occur non-linearly. Such a view no doubt slides Kuhn into a position of relativism, and has been the main point of attack on paradigm shifts. At the forefront of these attacks has been Michael Polanyi and his supporters, whose work on the epistemology of science has much the same ingredients but was eventually deprived of fame; Kuhn was even charged with plagiarism. The commonality of their arguments could be measured by a dissenting voice against objectivity in science. Polanyi thought of it as a false ideal, since for him the epistemological claims that defined science were based more on personal judgments, and therefore susceptible to fallibilism. The objective nature of science that obligates scientists to see things as they really are is thus dislodged by the above principle of subjectivity. But if science were to be seen as objective, then human subjectivity would indeed create a rupture as far as a purified version of scientific objectivity is sought. The subject, or the observer, undergoes what is termed the “observer effect”, which refers to the change that the act of observation makes on the phenomenon being observed.
This effect is all but ubiquitous across the domains of science and technology, ranging from the heisenbug(1) in computing through particle physics and thermodynamics to quantum mechanics. The observer effect of quantum mechanics is quite perplexing, and is a result of a phenomenon called “superposition”, which signifies existence in all possible states at once. Superposition is popularly credited to Schrödinger’s cat thought experiment, which entails a cat that is neither dead nor alive until observed. This has led physicists to take into account the acts of “observation” and “measurement” to comprehend the paradox in question, and thereby to resolve it. But there is still a minority of quantum physicists out there who vouch for the supremacy of an observer, despite the quantum entanglement effect that goes on to explain the impacts of “observation” and “measurement”.(2) Such a standpoint is indeed reflected in Derrida (9-10) as well, when he says (I quote him in full),

The modern dominance of the principle of reason had to go hand in hand with the interpretation of the essence of beings as objects, an object present as representation (Vorstellung), an object placed and positioned before a subject. This latter, a man who says ‘I’, an ego certain of itself, thus ensures his own technical mastery over the totality of what is. The ‘re-‘ of repraesentation also expresses the movement that accounts for – ‘renders reason to’ – a thing whose presence is encountered by rendering it present, by bringing it to the subject of representation, to the knowing self.

If Derridean deconstruction is to work on science and theory, the only way out is to relinquish the boundaries that define or divide the two disciplines. Moreover, if any looseness is encountered in objectivity, the ramifications are felt straight away at the level of scientific activity. Even theory does not remain immune to these consequences. Importantly, as scientific objectivity starts to wane, the corresponding philosophical luxury of avoiding the contingent wanes with it. Such a loss of representation, congruent with a certain theory of meaning we live by, has serious ethical-political affectations.

(1) Heisenbug is a pun on Heisenberg’s uncertainty principle, and is a bug in computing that is characterized by the disappearance of the bug itself when an attempt is made to study it. One common example is a bug that occurs in a program that was compiled with an optimizing compiler, but not in the same program when compiled without optimization (e.g., for generating a debug-mode version). Another example is a bug caused by a race condition. A heisenbug may also appear in a system that does not conform to the command-query separation design guideline, since a routine called more than once could return different values each time, generating hard-to-reproduce bugs in a race condition scenario. One common reason for heisenbug-like behaviour is that executing a program in debug mode often cleans memory before the program starts and forces variables onto stack locations, instead of keeping them in registers. These differences in execution can alter the effect of bugs involving out-of-bounds member access, incorrect assumptions about the initial contents of memory, or floating-point comparisons (for instance, when a floating-point variable in a 32-bit stack location is compared to one in an 80-bit register). Another reason is that debuggers commonly provide watches or other user interfaces that cause additional code (such as property accessors) to be executed, which can, in turn, change the state of the program. Yet another reason is a fandango on core, the effect of a pointer running out of bounds. In C++, many heisenbugs are caused by uninitialized variables. Another similar pun-intended bug encountered in computing is the schrödinbug. A schrödinbug is a bug that manifests only after someone reading the source code or using the program in an unusual way notices that it never should have worked in the first place, at which point the program promptly stops working for everybody until fixed.
The Jargon File adds: “Though… this sounds impossible, it happens; some programs have harbored latent schrödinbugs for years.”

(2) There is a related issue in quantum mechanics relating to whether systems have pre-existing – prior to measurement, that is – properties corresponding to all measurements that could possibly be made on them. The assumption that they do is often referred to as “realism” in the literature, although it has been argued that the word “realism” is being used in a more restricted sense than philosophical realism. A recent experiment in the realm of quantum physics has been quoted as meaning that we have to “say goodbye” to realism, although the author of the paper states only that “we would [..] have to give up certain intuitive features of realism”. These experiments demonstrate a puzzling relationship between the act of measurement and the system being measured, although it is clear from experiment that an “observer” consisting of a single electron is sufficient – the observer need not be a conscious observer. Also, note that Bell’s Theorem suggests strongly that the idea that the state of a system exists independently of its observer may be false. Note that the special role given to observation (the claim that it affects the system being observed, regardless of the specific method used for observation) is a defining feature of the Copenhagen Interpretation of quantum mechanics. Other interpretations resolve the apparent paradoxes from experimental results in other ways. For instance, the Many-Worlds Interpretation posits the existence of multiple universes in which an observed system displays all possible states to all possible observers. In this model, observation of a system does not change the behavior of the system – it simply answers the question of which universe(s) the observer(s) is (are) located in: in some universes the observer would observe one result from one state of the system, and in others the observer would observe a different result from a different state of the system.