Canonical Fibrations on Geodesics


There is a realisation of the canonical fibrations of flag manifolds that serves to introduce a twistor space. For this, assume that G is of adjoint type (i.e. has trivial centre) and let ΩG denote the infinite-dimensional manifold of based loops in G: the loop group. In fact ΩG is a Kähler manifold and may be viewed as a flag manifold ΛGC/P, where ΛGC is the group of loops in GC and P is the subgroup of those loops that extend holomorphically to the disc. We have various fibrations ρλ: ΩG → G given by evaluation at λ ∈ S1, and in some ways ρ−1 behaves like a canonical fibration, making ΩG into a universal twistor space for G. It is a theorem of Uhlenbeck that any harmonic map of S2 into G is of the form ρ−1 ◦ Φ for some “super-horizontal” holomorphic map Φ: S2 → ΩG.
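In symbols, the identifications above can be sketched as follows (a sketch in the standard loop-group notation, with Λ+GC denoting the loops extending holomorphically into the disc; the notation is an assumption, not verbatim from the source):

```latex
\Omega G \;\cong\; \Lambda G^{\mathbb{C}} / \Lambda^{+} G^{\mathbb{C}},
\qquad
\rho_{\lambda}\colon \Omega G \to G,
\qquad
\rho_{\lambda}(\gamma) = \gamma(\lambda), \quad \lambda \in S^{1}.
```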

The flag manifolds of G embed in ΩG as conjugacy classes of geodesics and we find a particular embedding of this kind using the canonical element. Indeed, our assumption that G be centre-free means that exp 2πξ = e for any canonical element ξ. Thus if F = G/H = GC/P is a flag manifold with ξ the canonical element of p, we may define a map Γ: F → ΩG by setting

Γ(eH) = (e^{√−1 t} ↦ exp tξ)

and extending by equivariance. Moreover, if N is the inner symmetric space associated to F, we have a totally geodesic immersion γ : N → G defined by setting γ(x) equal to the element of G that generates the involution at x. We now have:

Γ: F → ΩG is a totally geodesic, holomorphic, isometric immersion and the following diagram commutes

[Diagram: Γ: F → ΩG across the top, π1: F → N down the left, ρ−1: ΩG → G down the right, and γ: N → G across the bottom, so that ρ−1 ◦ Γ = γ ◦ π1.]

where π1 is a canonical fibration. Thus we have a realisation of the canonical fibrations as the trace of ρ−1 on certain conjugacy classes of geodesics.


Reclaim Modernity: Beyond Markets, Beyond Machines (Mark Fisher & Jeremy Gilbert)


It is understandable that the mainstream left has traditionally been suspicious of anti-bureaucratic politics. The Fabian tradition has always believed – has been defined by its belief – in the development and extension of an enlightened bureaucracy as the main vehicle of social progress. Attacking ‘bureaucracy’ has been – since at least the 1940s – a means by which the Right has attacked the very idea of public service and collective action. Since the early days of Thatcherism, there has been very good reason to become nervous whenever someone attacks bureaucracy, because such attacks are almost invariably followed by plans not for democratisation, but for privatisation.

Nonetheless, it is precisely this situation that has produced a certain paralysis of the Left in the face of one of its greatest political opportunities, an opportunity which it can only take if it can learn to speak an anti-bureaucratic language with confidence and conviction. On the one hand, this is a simple populist opportunity to unite constituencies within both the public and private sectors: simple, but potentially strategically crucial. As workers in both sectors and as users of public services, the public dislike bureaucracy and apparent over-regulation. The Left misses an enormous opportunity if it fails to capitalise on this dislike and transform it into a set of democratic demands.

On the other hand, anti-bureaucratism marks one of the critical points of failure and contradiction in the entire neoliberal project. For the truth is that neoliberalism has not kept its promise in this regard. It has not reduced the interference of managerial mechanisms and apparently pointless rules and regulations in the working life of public-sector professionals, or of public-service users, or of the vast majority of workers in the private sector. In fact it has led in many cases to an enormous proliferation and intensification of just these processes. Targets, performance indicators, quantitative surveys and managerial algorithms dominate more of life today than ever before, not less. The only people who really suffer less regulation than they did in the past are the agents of finance capital: banks, traders, speculators and fund managers.

Where de-regulation is a reality for most workers is not in their working lives as such, but in the removal of those regulations which once protected their rights to secure work, and to a decent life outside of work (pensions, holidays, leave entitlements, etc.). The precarious labour market is not a zone of freedom for such workers, but a space in which the fact of precarity itself becomes a mechanism of discipline and regulation. It only becomes a zone of freedom for those who already have enough capital to be able to choose when and where to work, or to benefit from the hyper-mobility and enforced flexibility of contemporary capitalism.


The Concern for Historical Materialism. Thought of the Day 53.0


The concern for historical materialism, in spite of Marx’s differentiation between history and pre-history, is that totalisation might not be historically groundable after all, and must instead be constituted in other ways: whether logically, transcendentally or naturally. The ‘Consciousness’ chapter of the Phenomenology, a blend of all three, becomes a transcendent(al) logic of phenomena – individual, universal, particular – and ceases to provide any genuine phenomenology of ‘the experience of consciousness’. Natural consciousness is not strictly speaking a standpoint (no real opposition), so it can offer no critical grounds of itself to confer synthetic unity upon the universal, that which is taken to a higher level in ‘Self-Consciousness’ (only to be retrospectively confirmed). Yet Hegel does just this from the outset. In ‘Perception’, we read that ‘[o]n account of the universality [Allgemeinheit] of the property, I must … take the objective essence to be on the whole a community [Gemeinschaft]’. Universality always sides with community, the Allgemeine with the Gemeinschaft, as if the synthetic operation had taken place prior to its very operability. Unfortunately for Hegel, the ‘free matters’ of all possible properties pave the way for the ‘interchange of forces’ in ‘Force and the Understanding’, and hence infinity, life and – spirit. In the midst of the master-slave dialectic, Hegel admits that ‘[i]n this movement we see repeated the process which represented itself as the play of forces, but repeated now in consciousness’ [sic].

Iain Hamilton Grant’s Schelling in Opposition to Fichte. Note Quote.


The stated villain of Philosophies of Nature is not Hegelianism but rather ‘neo-Fichteanism’. Grant’s Philosophies of Nature After Schelling takes up the task of freeing Schelling from the accoutrements of Kantian and Fichtean narrow transcendentalism, rescuing him from the narrow-minded inertness and mechanicality to which Kant and Fichte had consigned nature. This is the Deleuzean influence on Grant. Manuel De Landa makes a vociferous case in this regard. According to De Landa, the inertness of matter was rubbished by Deleuze, who sought a morphogenesis of form and thereby launched a new kind of materialism. This is Deleuze’s anti-essentialist position. Essentialism holds that matter and energy are inert: they have no morphogenetic capabilities and cannot give rise to new forms on their own. Disciplines like complexity theory and non-linear dynamics restore to matter its autonomy over inertness and its capabilities in terms of charge. But Grant’s account of the relationship between Fichte and Schelling actually obscures the rich meaning of speculation in Hegel and after. Grant quite accurately recalls that Schelling confronted Fichte’s identification of the ‘not-I’ with passive nature – the consequence of identifying all free activity with the ‘I’ alone. For Grant, what Jacobi termed ‘speculative egotism’ becomes the nightmare of modern philosophy and of technological modernity at large. The ecological concern is never quite made explicit in Philosophies of Nature, yet Grant’s introduction to Schelling’s On the World Soul helps to contextualise the meaning of his ‘geology of morals’.

What we miss from Grant’s critique of Fichte is the manner by which the corrective, positive characterisation of nature proceeds from Schelling’s confirmation of Fichte’s rendering of the fact of consciousness (Tatsache) into the act of consciousness (Tathandlung). Schelling, as a consequence, becomes singularly critical of contemplative speculation, since activity now implies working on nature and thereby changing it – along with it, we might say – rather than either simply observing it or even experimenting upon it.

In fact, Grant reads Schelling only in opposition to Fichte, with drastic consequences for his speculative realism: the post-Fichtean element of Schelling’s naturephilosophy allows for the new sense of speculation he will share with Hegel – even though they will indeed turn this against Kant and Fichte. Without this account, we are left with the older, contemplative understanding of metaphysical speculation, which leads to a certain methodologism in Grant’s study. Hence, ‘the principle method of naturephilosophy consists in “unconditioning” the phenomena’. Relatedly, Meillassoux defines the ‘speculative’ as ‘every type of thinking’ – not acting, – ‘that claims to be able to access some form of absolute’.

In direct contrast to this approach, the collective ‘system programme’ of Hegel, Schelling and Hölderlin was not a programme for thinking alone. Their revolutionised sense of speculation, from contemplation of the stars to reform of the worldly, is overlooked by today’s speculative realism – a philosophy that ‘refuses to interrogate reality through human (linguistic, cultural or political) mediations of it’. We recall that Kant similarly could not extend his Critique to speculative reason precisely on account of his contemplative determination of pure reason (in terms of the hierarchical gap between reason and the understanding). Grant’s ‘geology of morals’ does not oppose ‘Kanto-Fichtean philosophy’, as he has it, but rather remains structurally within the sphere of Kant’s pre-political metaphysics.

Quantum Energy Teleportation. Drunken Risibility.


Time is one of the most difficult concepts in physics. It enters the equations in a rather artificial way – as an external parameter. Although strictly speaking time is a quantity that we measure, it is not possible in quantum physics to define a time-observable in the same way as for the other quantities that we measure (position, momentum, etc.). The intuition that we have about time is that of a uniform flow, as suggested by the regular ticks of clocks. Time flows undisturbed by the variety of events that may occur in an irregular pattern in the world. Similarly, the quantum vacuum is the most regular state one can think of. For example, a persistent superconducting current flows at a constant speed – essentially forever. Can one then use the quantum vacuum as a clock? This is a fascinating dispute in condensed-matter physics, formulated as the problem of the existence of time crystals. A time crystal, by analogy with a crystal in space, is a system that displays a time-regularity under measurement, while being in the ground (vacuum) state.

Then, if there is an energy (the zero-point energy) associated with empty space, it follows via the special theory of relativity that this energy should correspond to an inertial mass. By the principle of equivalence of the general theory of relativity, inertial mass is identical with the gravitational mass. Thus, empty space must gravitate. So, how much does empty space weigh? This question brings us to the frontiers of our knowledge of vacuum – the famous problem of the cosmological constant, a problem that Einstein was wrestling with, and which is still an open issue in modern cosmology.

Finally, although we cannot locally extract the zero-point energy of the vacuum fluctuations, the vacuum state of a field can be used to transfer energy from one place to another by using only information. This protocol has been called quantum energy teleportation and uses the fact that different spatial regions of a quantum field in the ground state are entangled. It then becomes possible to extract locally energy from the vacuum by making a measurement in one place, then communicating the result to an experimentalist in a spatially remote region, who would be able then to extract energy by making an appropriate (depending on the result communicated) measurement on her or his local vacuum. This suggests that the vacuum is the primordial essence, the ousia from which everything came into existence.

Potential Synapses. Thought of the Day 52.0

For a neuron to recognize a pattern of activity it requires a set of co-located synapses (typically fifteen to twenty) that connect to a subset of the cells that are active in the pattern to be recognized. Learning to recognize a new pattern is accomplished by the formation of a set of new synapses co-located on a dendritic segment.

Untitled

Figure: Learning by growing new synapses. Learning in an HTM neuron is modeled by the growth of new synapses from a set of potential synapses. A “permanence” value is assigned to each potential synapse and represents the growth of the synapse. Learning occurs by incrementing or decrementing permanence values. The synapse weight is a binary value set to 1 if the permanence is above a threshold.

Figure shows how we model the formation of new synapses in a simulated Hierarchical Temporal Memory (HTM) neuron. For each dendritic segment we maintain a set of “potential” synapses between the dendritic segment and other cells in the network that could potentially form a synapse with the segment. The number of potential synapses is larger than the number of actual synapses. We assign each potential synapse a scalar value called “permanence” which represents stages of growth of the synapse. A permanence value close to zero represents an axon and dendrite with the potential to form a synapse but that have not commenced growing one. A 1.0 permanence value represents an axon and dendrite with a large fully formed synapse.

The permanence value is incremented and decremented using a Hebbian-like rule. If the permanence value exceeds a threshold, such as 0.3, then the weight of the synapse is 1; if the permanence value is at or below the threshold, then the weight of the synapse is 0. The threshold represents the establishment of a synapse, albeit one that could easily disappear. A synapse with a permanence value of 1.0 has the same effect as a synapse with a permanence value at threshold, but is not as easily forgotten. Using a scalar permanence value enables on-line learning in the presence of noise. A previously unseen input pattern could be noise or it could be the start of a new trend that will repeat in the future. By growing new synapses, the network can start to learn a new pattern when it is first encountered, but only act differently after several presentations of the new pattern. Increasing permanence beyond the threshold means that patterns experienced more often than others will take longer to forget.
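The growth rule just described can be sketched in a few lines of Python. The class name, parameter values (increment 0.05, decrement 0.02, ten potential cells) and cell indexing below are illustrative assumptions; only the 0.3 connection threshold, the binary weights and the Hebbian-like update come from the text.

```python
import random

CONNECTED_THRESHOLD = 0.3  # permanence above this => synapse weight 1 (from the text)

class DendriticSegment:
    """A set of potential synapses with scalar permanence values."""

    def __init__(self, potential_cells, seed=0):
        rng = random.Random(seed)
        # Each potential synapse starts with a small permanence,
        # below the connection threshold: no synapse formed yet.
        self.permanence = {c: rng.uniform(0.0, 0.25) for c in potential_cells}

    def weights(self):
        # Binary synapse weight: 1 iff permanence exceeds the threshold.
        return {c: int(p > CONNECTED_THRESHOLD) for c, p in self.permanence.items()}

    def overlap(self, active_cells):
        # Number of connected synapses onto currently active cells.
        w = self.weights()
        return sum(w[c] for c in self.permanence if c in active_cells)

    def learn(self, active_cells, inc=0.05, dec=0.02):
        # Hebbian-like rule: reinforce synapses onto active cells,
        # decay the rest; permanence stays clamped to [0, 1].
        for c in self.permanence:
            if c in active_cells:
                self.permanence[c] = min(1.0, self.permanence[c] + inc)
            else:
                self.permanence[c] = max(0.0, self.permanence[c] - dec)

# After enough presentations of a pattern, the corresponding permanences
# cross the threshold and the segment starts to respond to it.
seg = DendriticSegment(range(10), seed=1)
pattern = {0, 1, 2}
for _ in range(10):
    seg.learn(pattern)
```

In a full HTM simulation a segment would be said to recognize a pattern once its overlap with the active cells exceeds a fixed count (fifteen to twenty in the text); here the overlap simply grows from 0 to 3 as the three reinforced permanences cross the threshold.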

HTM neurons and HTM networks rely on distributed patterns of cell activity, thus the activation strength of any one neuron or synapse is not very important. Therefore, in HTM simulations we model neuron activations and synapse weights with binary states. Additionally, it is well known that biological synapses are stochastic, so a neocortical theory cannot require precision of synaptic efficacy. Although scalar states and weights might improve performance, they are not required from a theoretical point of view.

Creation of a Bacterial Cell Controlled by a Chemically Synthesized Genome


The design, synthesis and assembly of the 1.08-Mbp Mycoplasma mycoides JCVI-syn1.0 genome, starting from digitized genome sequence information, and its transplantation into a Mycoplasma capricolum recipient cell create new Mycoplasma mycoides cells that are controlled only by the synthetic chromosome. The only DNA in the cells is the designed synthetic DNA sequence, including “watermark” sequences and other designed gene deletions and polymorphisms, and mutations acquired during the building process. The new cells have expected phenotypic properties and are capable of continuous self-replication.