Diagrammatic Political Via The Exaptive Processes


The principle of individuation is the operation that, in the matter taking form, by means of topological conditions […] carries out an energy exchange between the matter and the form until the unity leads to a state – the energy conditions express the whole system. Internal resonance is a state of equilibrium. One could say that the principle of individuation is the common allagmatic system which requires this realization of the energy conditions and the topological conditions […] it can produce effects at all points of the system in an enclosure […]

This operation rests on, or starts from, a singularity of average magnitude, topologically definite.

If we throw in a pinch of Gilbert Simondon’s concept of transduction, there’s a basic recipe, or toolkit, for exploring the relational intensities between the three informal (theoretical) dimensions of knowledge, power and subjectification pursued by Foucault with respect to formal practice. Supplanting Foucault’s process of subjectification with Simondon’s more eloquent process of individuation marks an entry for imagining the continuous, always partial, phase-shifting resolutions of the individual. This is not identity as fixed and positionable; it’s a preindividual dynamic that affects an always becoming-individual. It’s the pre-formative as performative. Transduction is a process of individuation. It leads to individuated beings, such as things, gadgets, organisms, machines, self and society, which could be the object of knowledge. It is an ontogenetic operation which provisionally resolves incompatibilities between different orders or different zones of a domain.

What is at stake in the bigger picture, in a diagrammatic politics, is double-sided. Just as there is matter in expression and expression in matter, there is event-value in an exchange-value paradigm, which in fact amplifies the force of its power relations. The economic engine of our time feeds on event potential becoming-commodity. It grows and flourishes on the mass production of affective intensities. Reciprocally, there are degrees of exchange-value in eventness. It’s the recursive loopiness of our current Creative Industries diagram, of which the social networking praxis of Web 2.0 is emblematic and from which it has much to learn.


Without Explosions, WE Would NOT Exist!


The matter and radiation in the universe get hotter and hotter as we go back in time towards the initial quantum state, because they were compressed into a smaller volume. In this Hot Big Bang epoch in the early universe, we can use standard physical laws to examine the processes going on in the expanding mixture of matter and radiation. A key feature is that about 300,000 years after the start of the Hot Big Bang epoch, nuclei and electrons combined to form atoms. At earlier times, when the temperature was higher, atoms could not exist, as the radiation then had so much energy that it disrupted any atoms that tried to form into their constituent parts (nuclei and electrons). Thus at earlier times matter was ionized, consisting of negatively charged electrons moving independently of positively charged atomic nuclei. Under these conditions, the free electrons interact strongly with radiation by Thomson scattering. Consequently matter and radiation were tightly coupled in equilibrium at those times, and the Universe was opaque to radiation. When the temperature dropped through the ionization temperature of about 4000 K, atoms formed from the nuclei and electrons, and this scattering ceased: the Universe became very transparent. The time when this transition took place is known as the time of decoupling – it was the time when matter and radiation ceased to be tightly coupled to each other, at a redshift z_dec ≃ 1100 (Scott Dodelson, Modern Cosmology, Academic Press).

The baryonic matter density, the radiation density, and the radiation temperature scale with the scale factor S(t) as

$\mu_{bar} \propto S^{-3}, \quad \mu_{rad} \propto S^{-4}, \quad T_{rad} \propto S^{-1}$ —– (1)

The scale factor S(t) obeys the Raychaudhuri equation

$3\ddot{S}/S = -\tfrac{1}{2}\kappa(\mu + 3p/c^{2}) + \Lambda$ —– (2)

where κ is the gravitational constant and Λ the cosmological constant.

By equation (1), the universe was radiation dominated (μ_rad ≫ μ_mat) at early times and matter dominated (μ_rad ≪ μ_mat) at late times; matter-radiation density equality occurred significantly before decoupling (the temperature T_eq at which this equality occurred was T_eq ≃ 10^4 K; at that time the scale factor was S_eq ≃ 10^-4 S_0, where S_0 is the present-day value). The dynamics of both the background model and of perturbations about that model differ significantly before and after S_eq.
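(A minimal numerical sketch, not part of the cited material: it simply evaluates the scaling laws (1), using the rounded equality value S_eq ≃ 10^-4 S_0 quoted above as the normalisation, to show how radiation domination gives way to matter domination; the function name and sample epochs are illustrative.)

```python
# Sketch of the scaling laws (1): mu_mat ∝ S^-3, mu_rad ∝ S^-4,
# so mu_rad/mu_mat ∝ 1/S. Normalise the ratio to 1 at S_eq ≈ 1e-4 * S0
# (the rounded equality value quoted in the text) and evaluate it at a few epochs.
S0 = 1.0          # present-day scale factor (normalisation choice)
S_eq = 1e-4 * S0  # scale factor at matter-radiation equality (from the text)

def rad_to_mat_ratio(S):
    """mu_rad / mu_mat, equal to 1 at S_eq and scaling as 1/S."""
    return S_eq / S

for S in (1e-6 * S0, S_eq, 1e-3 * S0, S0):
    print(f"S/S0 = {S/S0:.0e}  ->  mu_rad/mu_mat = {rad_to_mat_ratio(S):.0e}")
# Early times (S << S_eq): ratio >> 1, radiation dominated.
# Late times  (S >> S_eq): ratio << 1, matter dominated; today it is ~1e-4.
```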

Radiation was emitted by matter at the time of decoupling, thereafter travelling freely to us through the intervening space. When it was emitted, it had the form of blackbody radiation, because this is a consequence of matter and radiation being in thermodynamic equilibrium at earlier times. Thus the matter at z = z_dec forms the Last Scattering Surface (LSS) in the early universe, emitting Cosmic Blackbody Background Radiation (‘CBR’) at 4000 K, which since then has travelled freely with its temperature T scaling inversely with the scale factor of the universe. As the radiation travelled towards us, the universe expanded by a factor of about 1100; consequently by the time it reaches us, it has cooled to 2.75 K (that is, about 3 degrees above absolute zero, with a spectrum peaking in the microwave region), and so is extremely hard to observe. It was however detected in 1965, and its spectrum has since been intensively investigated, its blackbody nature being confirmed to high accuracy (R. B. Partridge, 3K: The Cosmic Microwave Background Radiation). Its existence is now taken as solid proof both that the Universe has indeed expanded from a hot early phase, and that standard physics applied unchanged at that era in the early universe.
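(Again a small sketch rather than anything from the cited sources: it applies the T ∝ 1/S scaling, i.e. T(z) = T_0(1 + z), using only the rounded figures quoted in the paragraph, 2.75 K today and z_dec ≃ 1100.)

```python
# Sketch of the temperature scaling of the background radiation: T ∝ 1/S,
# equivalently T(z) = T0 * (1 + z). Values are the rounded figures from the text.
T0 = 2.75          # K, present-day CBR temperature (text quotes 2.75 K)
z_dec = 1100       # redshift of the last scattering surface

def T_at_redshift(z, T0=T0):
    """Blackbody temperature of the background radiation at redshift z."""
    return T0 * (1 + z)

print(T_at_redshift(0))      # 2.75 K today
print(T_at_redshift(z_dec))  # ~3.0e3 K at decoupling, of the order of the
                             # ~4000 K ionisation temperature quoted above
```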

The thermal capacity of the radiation is hugely greater than that of the matter. At very early times before decoupling, the temperatures of the matter and radiation were the same (because they were in equilibrium with each other), scaling as 1/S(t) (equation (1) above). The temperature of the early universe exceeded anything that can ever be attained on Earth or even in the centre of the Sun; as it dropped towards its present value of 3 K, successive physical reactions took place that determined the nature of the matter we see around us today. At very early times and high temperatures, only elementary particles could survive, and even neutrinos had a very small mean free path; as the universe cooled down, neutrinos decoupled from the matter and streamed freely through space. At these times the expansion of the universe was radiation dominated, and we can approximate the universe then by models with {k = 0, w = 1/3, Λ = 0}, the resulting simple solution of the Friedmann equation

$3\dot{S}^{2}/S^{2} = A/S^{3} + B/S^{4} + \Lambda - 3k/S^{2}$ —– (3)

uniquely relating time to temperature:

$S(t) = S_{0}\,t^{1/2}, \quad t = 1.92\,\mathrm{sec}\,[T/10^{10}\,\mathrm{K}]^{-2}$ —– (4)

(There are no free constants in the latter equation).
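(The following sketch assumes nothing beyond equations (2)–(4) above: it uses sympy to check that S(t) = S_0 t^{1/2} is consistent with the radiation-dominated case of (2) and (3), with A = k = Λ = 0 and p = μc²/3, and then evaluates the time-temperature relation (4) at two temperatures.)

```python
# (i) Verify that S(t) = S0 * t**(1/2) satisfies the radiation-only Friedmann
#     equation (3) with A = k = Lambda = 0, and the Raychaudhuri equation (2)
#     for radiation (p = mu*c^2/3, so mu + 3p/c^2 = 2*mu, and kappa*mu = B/S^4).
# (ii) Evaluate the time-temperature relation (4): t = 1.92 s * (T / 1e10 K)**-2.
import sympy as sp

t, S0, B = sp.symbols('t S0 B', positive=True)

S = S0 * sp.sqrt(t)                       # trial solution S(t) = S0 * t^{1/2}
Sdot, Sddot = sp.diff(S, t), sp.diff(S, t, 2)

# Friedmann equation (3), radiation term only: 3*Sdot^2/S^2 = B/S^4.
# This fixes the constant B in terms of S0:
B_val = sp.solve(sp.Eq(3*Sdot**2/S**2, B/S**4), B)[0]   # -> B = 3*S0**4/4

# Raychaudhuri equation (2) for radiation with Lambda = 0:
# 3*Sddot/S = -(1/2)*kappa*(mu + 3p/c^2) = -kappa*mu = -B/S^4.
lhs = 3*Sddot/S
rhs = -B_val/S**4
print(sp.simplify(lhs - rhs))             # 0  -> S ∝ t^{1/2} is consistent

# Time-temperature relation (4), with no free constants:
def t_of_T(T_kelvin):
    return 1.92 * (T_kelvin / 1e10)**-2   # seconds

print(t_of_T(1e10))   # 1.92 s
print(t_of_T(1e9))    # 192 s, i.e. about 3 minutes: the epoch of nucleosynthesis
                      # discussed in the next paragraph
```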

At very early times, even neutrinos were tightly coupled and in equilibrium with the radiation; they decoupled at about 10^10 K, resulting in a relic neutrino background density in the universe today of about Ω_ν0 ≃ 10^-5 if they are massless (but it could be higher depending on their masses). Key events in the early universe are associated with out-of-equilibrium phenomena. An important event was the era of nucleosynthesis, the time when the light elements were formed. Above about 10^9 K, nuclei could not exist because the radiation was so energetic that as fast as they formed, they were disrupted into their constituent parts (protons and neutrons). However below this temperature, if particles collided with each other with sufficient energy for nuclear reactions to take place, the resultant nuclei remained intact (the radiation being less energetic than their binding energy and hence unable to disrupt them). Thus the nuclei of the light elements – deuterium, tritium, helium, and lithium – were created by neutron capture. This process ceased when the temperature dropped below about 10^8 K (the nuclear reaction threshold). In this way, the proportions of these light elements at the end of nucleosynthesis were determined; they have remained virtually unchanged since. The rate of reaction was extremely high; all this took place within the first three minutes of the expansion of the Universe. One of the major triumphs of Big Bang theory is that theory and observation are in excellent agreement provided the density of baryons is low: Ω_bar0 ≃ 0.044. Then the predicted abundances of these elements (25% helium by weight, 75% hydrogen, the others being less than 1%) agree very closely with the observed abundances. Thus the standard model explains the origin of the light elements in terms of known nuclear reactions taking place in the early Universe. However heavier elements cannot form in the time available (about 3 minutes).
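(A back-of-the-envelope check of the ~25% helium-by-weight figure. The neutron-to-proton ratio n/p ≈ 1/7 at the onset of nucleosynthesis is a standard textbook value, not something stated above, so treat it as an assumption of the sketch.)

```python
# Rough check of the ~25% helium mass fraction. Assumes the standard textbook
# neutron-to-proton ratio n/p ≈ 1/7 at the onset of nucleosynthesis (an input
# NOT given in the text above). If essentially all available neutrons end up
# bound in helium-4 (2 neutrons + 2 protons per nucleus), the helium mass
# fraction is Y = 2*(n/p) / (1 + n/p).
n_over_p = 1.0 / 7.0

Y_helium = 2 * n_over_p / (1 + n_over_p)
X_hydrogen = 1 - Y_helium

print(f"Helium mass fraction   Y ≈ {Y_helium:.2f}")    # ≈ 0.25  (25% by weight)
print(f"Hydrogen mass fraction X ≈ {X_hydrogen:.2f}")  # ≈ 0.75  (75%)
```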

In a similar way, physical processes in the very early Universe (before nucleosynthesis) can be invoked to explain the ratio of matter to antimatter in the present-day Universe: a small excess of matter over antimatter must have been created then in the process of baryosynthesis, without which we could not exist today (if there were no such excess, matter and antimatter would all have annihilated to give just radiation). However other quantities (such as electric charge) are believed to have been conserved even in the extreme conditions of the early Universe, so their present values result from given initial conditions at the origin of the Universe, rather than from physical processes taking place as it evolved. In the case of electric charge, the total conserved quantity appears to be zero: after quarks form protons and neutrons at the time of baryosynthesis, there are equal numbers of positively charged protons and negatively charged electrons, so that at the time of decoupling there were just enough electrons to combine with the nuclei and form uncharged atoms (there seems to be no net electrical charge on astronomical bodies such as our galaxy; were this not true, electromagnetic forces would dominate cosmology, rather than gravity).

After decoupling, matter formed large-scale structures through gravitational instability, which eventually led to the formation of the first generation of stars and was probably associated with the re-ionization of matter. However at that time planets could not form, for a very important reason: there were no heavy elements present in the Universe. The first stars aggregated matter together by gravitational attraction, the matter heating up as it became more and more concentrated, until its temperature exceeded the thermonuclear ignition point and nuclear reactions started burning hydrogen to form helium. Eventually more complex nuclear reactions started in concentric spheres around the centre, leading to a build-up of heavy elements (carbon, nitrogen, and oxygen, for example), up to iron. These elements can form in stars because there is a long time available (millions of years) for the reactions to take place. Massive stars burn relatively rapidly, and eventually run out of nuclear fuel. The star becomes unstable, and its core rapidly collapses because of gravitational attraction. The consequent rise in temperature blows it apart in a giant explosion, during which time new reactions take place that generate elements heavier than iron; this explosion is seen by us as a Supernova (“New Star”) suddenly blazing in the sky, where previously there was just an ordinary star. Such explosions blow into space the heavy elements that had been accumulating in the star’s interior, forming vast filaments of dust around the remnant of the star. It is this material that can later be accumulated, during the formation of second-generation stars, to form planetary systems around those stars. Thus the elements of which we are made (the carbon, nitrogen, oxygen and iron nuclei, for example) were created in the extreme heat of stellar interiors, and made available for our use by supernova explosions. Without these explosions, we could not exist.

Conjuncted: Demise of Ontology


The demise of ontology in string theory opens new perspectives on the positions of Quine and Larry Laudan. Laudan stressed the discontinuity of ontological claims throughout the history of scientific theories. String theory’s comment on this observation is very clear: the ontological claim is not an appropriate element of highly developed physical theories. External ontological objects are reduced to the status of an approximative concept that only makes sense as long as one does not look too closely into the theory’s mathematical fine-structure. While one may consider the electron to be an object like a table, just smaller, the same verdict on, let’s say, a type IIB superstring is not justifiable. In this light it is evident that an ontological understanding of scientific objects cannot have any realist quality and must always be preliminary. Its specific form naturally depends on the type of approximation. Eventually all ontological claims are bound to evaporate in the complex structures of advanced physics. String theory thus confirms Laudan’s assertion and integrates it into a solid physical background picture.

In a remarkable way string theory awards new topicality to Quine’s notion of underdeterminism. The string theoretical scale-limit to new phenomenology gives substance to Quine’s concept of a theoretical scheme that fits all possible phenomenological data. In a sense string theory moves Quine’s concept from the regime of abstract and shadowy philosophical definitions to the regime of the physically meaningful. Quine’s notion of underdeterminism also remains unaffected by the emerging principle of theoretical uniqueness, which so seriously undermines the position of modest underdeterminism. Since theoretical uniqueness reveals itself in the context of new, so far undetected phenomenology, Quine’s purely ontological approach remains safely beyond its grasp. But the best is still to come: the various equivalent superstring theories appear as empirically equivalent but ‘logically incompatible’ theories of the very type implied by Quine’s underdeterminism hypothesis. The different string theories are not theoretically incompatible and unrelated concepts. On the contrary, they are merely different representations of one overall theoretical structure. What are incompatible are the ontological claims which can be imputed to the various representations. It is only at this level that Quine’s conjecture applies to string theory. And it is only at this level that it can be meaningful at all. Quine is no adherent of external realism and thus can afford a very wide interpretation of the notion ‘ontological object’. For him a world view’s ontology can well comprise oddities like spacetime points or mathematical sets. In this light the duality phenomenon could be taken to imply a shift of ontology away from an external ‘corporeal’ regime towards a purely mathematical one.

Putting external and mathematical ontologies into the same category blurs the central message that the new physical developments have in store for the philosophy of science. This message emerges much more clearly if formulated within the conceptual framework of scientific realism: an extrapolation of the notion ‘external ontological object’ from the visible to the invisible regime remains possible up to quantum field theory, if one wants to have it. It fails fundamentally at the stage of string theory. String theory simply is not a theory about invisible external objects.