Abstraction as Dissection of a Flat “Ontology”: The Illusiveness of Levels (Paper)

DeLanda:

…while an ontology based on relations between general types and particular instances is hierarchical, each level representing a different ontological category (organism, species, genera) [or strings, quarks, baryons], an approach in terms of interacting parts and emergent wholes leads to a flat ontology, one made exclusively of unique, singular individuals, differing in spatio-temporal scale but not in ontological status.

The following discussion, however, seeks to go further than DeLanda’s account of hierarchy, extending it to all spatio-temporal entities, hence the interjection of “strings, quarks, baryons” into the quote. That this extension is natural should become clear once van Fraassen’s role in this level-denying consilience is laid out. Furthermore, van Fraassen’s account will be employed to illustrate why any level-like organization attributed to the components of an explanation has no bearing on the explanation, and arises due to two things:

1) erroneously clumping together all types of belief statements into a single branch of philosophy that deals with knowledge, and

2) attempting to stratify the causal thicket that is the world, so as to conform the products of the scientific enterprise to monist and fundamentalist predilections.

Finally, there is one other way in which the account differs from DeLanda’s flat ontology, when touching upon Kant’s antinomy of teleology: the idea that levels of mechanisms telescope into a flat ontology, as every part and whole enjoys the same status in a scientific explanation, differing only in size.


Figure: The larger arrows point toward the conclusion of consilience. The smaller arrows suggest relationships that create cooperation toward the thesis. The blue bubbles represent the positive web of notions that cohere, and the red bubbles are those notions that are excluded from the web. The red arrows indicate where the ideas not included in the web arise, and the totality of this paper works toward a final explication as to why these are to be excluded. There are a few thin blue lines that are not included, because they would make a mess of the image, such as a line connecting abstraction and James’ pragmatism. However, the paper (Abstraction as Dissection of a Flat Ontology) endeavors to make these connections clear, for example, quoting James to show that he presents an idea that seems a precursor to Cartwright’s notion of abstraction. (Note: Yellow lines are really blue lines, but are yellow to avoid confusion that might ensue from blue lines passing through blue bubbles. Green lines are to indicate additivity. The red lines denote notions not connected to the web, yet bear some relation to ideas in the web.)

“The Conceptual Penis as a Social Construct”: Sokal-Like Hoax Returns to Test Academic Left’s Moral (Architecture + Orthodox Gender Studies) and Cripples It.


Destructive, unsustainable hegemonically male approaches to pressing environmental policy and action are the predictable results of a raping of nature by a male-dominated mindset. This mindset is best captured by recognizing the role of [sic] the conceptual penis holds over masculine psychology. When it is applied to our natural environment, especially virgin environments that can be cheaply despoiled for their material resources and left dilapidated and diminished when our patriarchal approaches to economic gain have stolen their inherent worth, the extrapolation of the rape culture inherent in the conceptual penis becomes clear… Toxic hypermasculinity derives its significance directly from the conceptual penis and applies itself to supporting neocapitalist materialism, which is a fundamental driver of climate change, especially in the rampant use of carbon-emitting fossil fuel technologies and careless domination of virgin natural environments. We need not delve deeply into criticisms of dialectic objectivism, or their relationships with masculine tropes like the conceptual penis to make effective criticism of (exclusionary) dialectic objectivism. All perspectives matter.

“The androcentric scientific and meta-scientific evidence that the penis is the male reproductive organ is considered overwhelming and largely uncontroversial.”

That’s how we began. We used this preposterous sentence to open a “paper” consisting of 3,000 words of utter nonsense posing as academic scholarship. Then a peer-reviewed academic journal in the social sciences accepted and published it.

This paper should never have been published. Titled, “The Conceptual Penis as a Social Construct,” our paper “argues” that “The penis vis-à-vis maleness is an incoherent construct. We argue that the conceptual penis is better understood not as an anatomical organ but as a gender-performative, highly fluid social construct.” As if to prove philosopher David Hume’s claim that there is a deep gap between what is and what ought to be, our should-never-have-been-published paper was published in the open-access (meaning that articles are freely accessible and not behind a paywall), peer-reviewed journal Cogent Social Sciences.

Assuming the pen names “Jamie Lindsay” and “Peter Boyle,” and writing for the fictitious “Southeast Independent Social Research Group,” we wrote an absurd paper loosely composed in the style of post-structuralist discursive gender theory. The paper was ridiculous by intention, essentially arguing that penises shouldn’t be thought of as male genital organs but as damaging social constructions. We made no attempt to find out what “post-structuralist discursive gender theory” actually means. We assumed that if we were merely clear in our moral implications that maleness is intrinsically bad and that the penis is somehow at the root of it, we could get the paper published in a respectable journal.

This already damning characterization of our hoax understates our paper’s lack of fitness for academic publication by orders of magnitude. We didn’t try to make the paper coherent; instead, we stuffed it full of jargon (like “discursive” and “isomorphism”), nonsense (like arguing that hypermasculine men are both inside and outside of certain discourses at the same time), red-flag phrases (like “pre-post-patriarchal society”), lewd references to slang terms for the penis, insulting phrasing regarding men (including referring to some men who choose not to have children as being “unable to coerce a mate”), and allusions to rape (we stated that “manspreading,” a complaint levied against men for sitting with their legs spread wide, is “akin to raping the empty space around him”). After completing the paper, we read it carefully to ensure it didn’t say anything meaningful, and as neither one of us could determine what it is actually about, we deemed it a success.

Why did Boghossian and Lindsay do this?

Sokal exposed an infatuation with academic puffery that characterizes the entire project of academic postmodernism. Our aim was smaller yet more pointed. We intended to test the hypothesis that flattery of the academic Left’s moral architecture in general, and of the moral orthodoxy in gender studies in particular, is the overwhelming determiner of publication in an academic journal in the field. That is, we sought to demonstrate that a desire for a certain moral view of the world to be validated could overcome the critical assessment required for legitimate scholarship. Particularly, we suspected that gender studies is crippled academically by an overriding almost-religious belief that maleness is the root of all evil. On the evidence, our suspicion was justified.

In the words of Graham Harman,

We kind of deserve it. There is still far too much empty jargon of this sort in the humanities and social sciences fields. Quite aside from whether or not you find the jargon off-putting, it leads to very bad writing, and when writing sounds bad it’s a much more serious sign of bad thinking than most people realize. (Nietzsche was on to this a long time ago, when he said that the only way to improve your writing is to improve your thoughts. Methodologically, I find the converse to be true as well. It is through trying to make your thoughts more readable that you make them better thoughts.) And again, I was one of the few people in the environs of continental philosophy who deeply enjoyed the original Sokal hoax. Until we stop writing (and thinking) like this, we will be repeatedly targeted by such hoaxes, and they will continue to sneak through. We ought to be embarrassed by this, and ought to ask ourselves some tough questions about our disciplinary norms, rather than pretending to be outraged at the “unethical behavior” of the hoax authors.

Endless turf war….

The authors worry that gender studies folk will believe that, “…men do often suffer from machismo braggadocio, and that there is an isomorphism between these concepts via some personal toxic hypermasculine conception of their penises.” But I don’t really see why a gender studies academic wouldn’t believe this… This is NOT a case of cognitive dissonance.

As much as the authors like to pretend that they have “no idea” what they are talking about, they clearly do. They are taking existing gender studies ideas and just turning up the volume and adding more jargon. As if this proves a point against the field.

The authors’ biases are worn on their sleeves. Their arguments are about as effective as a Men’s Rights Activist on Reddit. By using a backhanded approach in an attempt to give a coup de grace to gender studies academaniacs, all they’ve done is blow $625 and “exposed” the already well-known issue of pay-to-play. If they wanted to make an actual case against the “feminazis” writ large, I suggest they “man” up and actually make a real argument rather than show that a bunch of fancy words can fool some people. Ah!, but far from being a meta-analytical multiplier of defense, quantum homeomorphism slithers through the conceptual penis!

Without Explosions, WE Would NOT Exist!


The matter and radiation in the universe get hotter and hotter as we go back in time towards the initial quantum state, because they were compressed into a smaller volume. In this Hot Big Bang epoch in the early universe, we can use standard physical laws to examine the processes going on in the expanding mixture of matter and radiation. A key feature is that about 300,000 years after the start of the Hot Big Bang epoch, nuclei and electrons combined to form atoms. At earlier times when the temperature was higher, atoms could not exist, as the radiation then had so much energy it disrupted any atoms that tried to form into their constituent parts (nuclei and electrons). Thus at earlier times matter was ionized, consisting of negatively charged electrons moving independently of positively charged atomic nuclei. Under these conditions, the free electrons interact strongly with radiation by Thomson scattering. Consequently matter and radiation were tightly coupled in equilibrium at those times, and the Universe was opaque to radiation. When the temperature dropped through the ionization temperature of about 4000 K, atoms formed from the nuclei and electrons, and this scattering ceased: the Universe became very transparent. The time when this transition took place is known as the time of decoupling – it was the time when matter and radiation ceased to be tightly coupled to each other, at a redshift z_dec ≃ 1100 (Scott Dodelson, Modern Cosmology, Academic Press). Since

μ_bar ∝ S⁻³, μ_rad ∝ S⁻⁴, T_rad ∝ S⁻¹ —– (1)

the universe was radiation dominated (μ_rad ≫ μ_mat) at early times and matter dominated (μ_rad ≪ μ_mat) at late times; matter-radiation density equality occurred significantly before decoupling (the temperature T_eq when this equality occurred was T_eq ≃ 10⁴ K; at that time the scale factor was S_eq ≃ 10⁻⁴ S₀, where S₀ is the present-day value). The dynamics of both the background model and of perturbations about that model differ significantly before and after S_eq.

The scale factor S(t) obeys the Raychaudhuri equation

3S̈/S = −(1/2) κ(μ + 3p/c²) + Λ —– (2)

where κ is the gravitational constant and Λ the cosmological constant.
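The crossover between the two eras follows directly from the scaling relations (1); here is a minimal sketch (the normalisations are illustrative, not from the source — only the ratio matters):

```python
# Illustrative sketch of relations (1): matter density scales as S^-3 and
# radiation density as S^-4, so the ratio mu_rad/mu_bar falls off as 1/S.
# Normalised so the ratio equals 1 at the equality scale factor S_eq.

def rad_to_matter_ratio(S, S_eq=1.0):
    """Ratio mu_rad/mu_bar as a function of the scale factor S."""
    return S_eq / S

# Radiation dominates at early times (S << S_eq)...
assert rad_to_matter_ratio(1e-4) > 1.0
# ...matter dominates at late times, e.g. today, with S_0 ~ 1e4 * S_eq:
assert rad_to_matter_ratio(1e4) < 1.0
```

With S_eq ≃ 10⁻⁴ S₀ this places equality well before decoupling, as stated above.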

Radiation was emitted by matter at the time of decoupling, thereafter travelling freely to us through the intervening space. When it was emitted, it had the form of blackbody radiation, because this is a consequence of matter and radiation being in thermodynamic equilibrium at earlier times. Thus the matter at z = z_dec forms the Last Scattering Surface (LSS) in the early universe, emitting Cosmic Blackbody Background Radiation (‘CBR’) at 4000 K, that since then has travelled freely with its temperature T scaling inversely with the scale function of the universe. As the radiation travelled towards us, the universe expanded by a factor of about 1100; consequently by the time it reaches us, it has cooled to 2.75 K (that is, about 3 degrees above absolute zero, with a spectrum peaking in the microwave region), and so is extremely hard to observe. It was however detected in 1965, and its spectrum has since been intensively investigated, its blackbody nature being confirmed to high accuracy (R. B. Partridge, 3K: The Cosmic Microwave Background Radiation). Its existence is now taken as solid proof both that the Universe has indeed expanded from a hot early phase, and that standard physics applied unchanged at that era in the early universe.
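The T ∝ 1/S cooling can be checked against the numbers quoted above in one line of arithmetic; a quick sketch (figures as given in the text):

```python
# Blackbody temperature scales as T ~ 1/S, so the temperature at emission is
# T_0 * (1 + z_dec). Figures below are the ones quoted in the text.

T_0 = 2.75      # K, CBR temperature measured today
z_dec = 1100    # redshift of the last scattering surface

T_emitted = T_0 * (1 + z_dec)
print(f"T at last scattering ≈ {T_emitted:.0f} K")
```

This yields roughly 3000 K, of the same order as the ~4000 K ionization temperature quoted above (scattering effectively ceases somewhat below the ionization threshold).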

The thermal capacity of the radiation is hugely greater than that of the matter. At very early times before decoupling, the temperatures of the matter and radiation were the same (because they were in equilibrium with each other), scaling as 1/S(t) (Equation 1 above). The early universe exceeded any temperature that can ever be attained on Earth or even in the centre of the Sun; as it dropped towards its present value of 3 K, successive physical reactions took place that determined the nature of the matter we see around us today. At very early times and high temperatures, only elementary particles can survive and even neutrinos had a very small mean free path; as the universe cooled down, neutrinos decoupled from the matter and streamed freely through space. At these times the expansion of the universe was radiation dominated, and we can approximate the universe then by models with {k = 0, w = 1/3, Λ = 0}, the resulting simple solution of

3Ṡ²/S² = A/S³ + B/S⁴ + Λ − 3k/S² —– (3)

uniquely relating time to temperature:

S(t) = S₀ t^(1/2) , t = 1.92 sec [T/10¹⁰ K]⁻² —– (4)

(There are no free constants in the latter equation).
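Equation (4) makes the “first three minutes” arithmetic explicit; a short sketch (values as quoted in the text):

```python
# Radiation-era time-temperature relation (4): t = 1.92 s * (T / 1e10 K)^-2.
# There are no free constants: a temperature directly fixes the age of the
# universe during the radiation-dominated era.

def age_seconds(T_kelvin):
    return 1.92 * (T_kelvin / 1e10) ** -2

print(age_seconds(1e10))  # neutrino decoupling at ~1e10 K: about 2 seconds
print(age_seconds(1e9))   # nuclei survive below ~1e9 K: 192 s, i.e. ~3 minutes
```

The second value is why nucleosynthesis, discussed next, is over within the first few minutes of the expansion.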

At very early times, even neutrinos were tightly coupled and in equilibrium with the radiation; they decoupled at about 10¹⁰ K, resulting in a relic neutrino background density in the universe today of about Ω_ν0 ≃ 10⁻⁵ if they are massless (but it could be higher depending on their masses). Key events in the early universe are associated with out-of-equilibrium phenomena. An important event was the era of nucleosynthesis, the time when the light elements were formed. Above about 10⁹ K, nuclei could not exist because the radiation was so energetic that as fast as they formed, they were disrupted into their constituent parts (protons and neutrons). However below this temperature, if particles collided with each other with sufficient energy for nuclear reactions to take place, the resultant nuclei remained intact (the radiation being less energetic than their binding energy and hence unable to disrupt them). Thus the nuclei of the light elements – deuterium, tritium, helium, and lithium – were created by neutron capture. This process ceased when the temperature dropped below about 10⁸ K (the nuclear reaction threshold). In this way, the proportions of these light elements at the end of nucleosynthesis were determined; they have remained virtually unchanged since. The rate of reaction was extremely high; all this took place within the first three minutes of the expansion of the Universe. One of the major triumphs of Big Bang theory is that theory and observation are in excellent agreement provided the density of baryons is low: Ω_bar0 ≃ 0.044. Then the predicted abundances of these elements (25% Helium by weight, 75% Hydrogen, the others being less than 1%) agrees very closely with the observed abundances. Thus the standard model explains the origin of the light elements in terms of known nuclear reactions taking place in the early Universe. However heavier elements cannot form in the time available (about 3 minutes).

In a similar way, physical processes in the very early Universe (before nucleosynthesis) can be invoked to explain the ratio of matter to anti-matter in the present-day Universe: a small excess of matter over anti-matter must be created then in the process of baryosynthesis, without which we could not exist today (if there were no such excess, matter and antimatter would have all annihilated to give just radiation). However other quantities (such as electric charge) are believed to have been conserved even in the extreme conditions of the early Universe, so their present values result from given initial conditions at the origin of the Universe, rather than from physical processes taking place as it evolved. In the case of electric charge, the total conserved quantity appears to be zero: after quarks form protons and neutrons at the time of baryosynthesis, there are equal numbers of positively charged protons and negatively charged electrons, so that at the time of decoupling there were just enough electrons to combine with the nuclei and form uncharged atoms (it seems there is no net electrical charge on astronomical bodies such as our galaxy; were this not true, electromagnetic forces would dominate cosmology, rather than gravity).

After decoupling, matter formed large scale structures through gravitational instability which eventually led to the formation of the first generation of stars and is probably associated with the re-ionization of matter. However at that time planets could not form for a very important reason: there were no heavy elements present in the Universe. The first stars aggregated matter together by gravitational attraction, the matter heating up as it became more and more concentrated, until its temperature exceeded the thermonuclear ignition point and nuclear reactions started burning hydrogen to form helium. Eventually more complex nuclear reactions started in concentric spheres around the centre, leading to a build-up of heavy elements (carbon, nitrogen, oxygen for example), up to iron. These elements can form in stars because there is a long time available (millions of years) for the reactions to take place. Massive stars burn relatively rapidly, and eventually run out of nuclear fuel. The star becomes unstable, and its core rapidly collapses because of gravitational attraction. The consequent rise in temperature blows it apart in a giant explosion, during which time new reactions take place that generate elements heavier than iron; this explosion is seen by us as a Supernova (“New Star”) suddenly blazing in the sky, where previously there was just an ordinary star. Such explosions blow into space the heavy elements that had been accumulating in the star’s interior, forming vast filaments of dust around the remnant of the star. It is this material that can later be accumulated, during the formation of second generation stars, to form planetary systems around those stars. Thus the elements of which we are made (the carbon, nitrogen, oxygen and iron nuclei for example) were created in the extreme heat of stellar interiors, and made available for our use by supernova explosions. Without these explosions, we could not exist.

Conjuncted: Demise of Ontology


The demise of ontology in string theory opens new perspectives on the positions of Quine and Larry Laudan. Laudan stressed the discontinuity of ontological claims throughout the history of scientific theories. String theory’s comment on this observation is very clear: The ontological claim is no appropriate element of highly developed physical theories. External ontological objects are reduced to the status of an approximative concept that only makes sense as long as one does not look too closely into the theory’s mathematical fine-structure. While one may consider the electron to be an object like a table, just smaller, the same verdict on, let’s say, a type IIB superstring is not justifiable. In this light it is evident that an ontological understanding of scientific objects cannot have any realist quality and must always be preliminary. Its specific form naturally depends on the type of approximation. Eventually all ontological claims are bound to evaporate in the complex structures of advanced physics. String theory thus confirms Laudan’s assertion and integrates it into a solid physical background picture.

In a remarkable way string theory awards new topicality to Quine’s notion of underdeterminism. The string-theoretical scale-limit to new phenomenology realises Quine’s concept of a theoretical scheme that fits all possible phenomenological data. In a sense string theory moves Quine’s concept from the regime of abstract and shadowy philosophical definitions to the regime of the physically meaningful. Quine’s notion of underdeterminism also remains unaffected by the emerging principle of theoretical uniqueness, which so seriously undermines the position of modest underdeterminism. Since theoretical uniqueness reveals itself in the context of new, so far undetected phenomenology, Quine’s purely ontological approach remains safely beyond its grasp. But the best is still to come: The various equivalent superstring theories appear as empirically equivalent but ‘logically incompatible’ theories of the very type implied by Quine’s underdeterminism hypothesis. The different string theories are not theoretically incompatible and unrelated concepts. On the contrary they are merely different representations of one overall theoretical structure. Incompatible are the ontological claims which can be imputed to the various representations. It is only at this level that Quine’s conjecture applies to string theory. And it is only at this level that it can be meaningful at all. Quine is no adherent of external realism and thus can afford a very wide interpretation of the notion ‘ontological object’. For him a world view’s ontology can well comprise oddities like spacetime points or mathematical sets. In this light the duality phenomenon could be taken to imply a shift of ontology away from an external ‘corporal’ regime towards a purely mathematical one.

To put external and mathematical ontologies into the same category blurs the central message the new physical developments have in store for philosophy of science. This message emerges much clearer if formulated within the conceptual framework of scientific realism: An extrapolation of the notion ‘external ontological object’ from the visible to the invisible regime remains possible up to quantum field theory if one wants to have it. It fails fundamentally at the stage of string theory. String theory simply is no theory about invisible external objects.

Duality’s Anti-Realism or Poisoning Ontological Realism: The Case of Vanishing Ontology. Note Quote.


If the intuitive quality of the external ontological object is diminished piece by piece during the evolutionary progress of physical theory (which must be acknowledged also in a hidden parameter framework), is there any core of the notion of an ontological object at all that can be trusted to be immune against scientific decomposition?

Quantum mechanics cannot answer this question. Contemporary physics is in a quite different position. The full dissolution of ontology is a characteristic process of particle physics whose unfolding starts with quantum mechanics and gains momentum in gauge field theory until, in string theory, the ontological object has simply vanished.

The concept to be considered is string duality, with the remarkable phenomenon of T-duality according to which a string wrapped around a small compact dimension can as well be understood as a string that is not wrapped but moves freely along a large compact dimension. The phenomenon is rooted in the quantum principles but clearly transcends what one is used to in the quantum world. It is not a mere case of quantum indeterminacy concerning two states of the system. We rather face two theoretical formulations which are undistinguishable in principle so that they cannot be interpreted as referring to two different states at all. Nevertheless the two formulations differ in characteristics which lie at the core of any meaningful ontology of an external world. They differ in the shape of space-time and they differ in form and topological position of the elementary objects. The fact that those characteristics are reduced to technical parameters whose values depend on the choice of the theoretical formulation contradicts ontological scientific realism in the most straightforward way. If a situation can be described by two different sets of elementary objects depending on the choice of the theoretical framework, how can it make sense to assert that these ontological objects actually exist in an external world?

The question gets even more virulent as T-duality by no means remains the only duality relation that surfaces in string theory. It turns out that the existence of dualities is one of string theory’s most characteristic features. They seem to pop up wherever one looks for them. Probably the most important role played by duality relations today is to connect all different superstring theories. Before 1995 physicists knew 5 different types of superstring theory. Then it turned out that these 5 theories and a 6th by then unknown theory named ‘M-theory’ are interconnected by duality relations. Two types of duality are involved. Some theories can be transformed into each other through inversion of a compactification radius, which is the phenomenon we know already under the name of T-duality. Others can be transformed into each other by inversion of the string coupling constant. This duality is called S-duality. Then there is M-theory, where the string coupling constant is transformed into an additional 11th dimension whose size is proportional to the coupling strength of the dual theory. The described web of dualities connects theories whose elementary objects have different symmetry structure and different dimensionality. M-theory even has a different number of spatial dimensions than its co-theories. Duality nevertheless implies that M-theory and the 5 possible superstring theories only represent different formulations of one single actual theory. This statement constitutes the basis for string theory’s uniqueness claims and shows the pivotal role played by the duality principle.
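T-duality’s claim that the wrapped and unwrapped descriptions are indistinguishable can be illustrated on the standard closed-string spectrum; the sketch below uses the textbook momentum-plus-winding mass formula on a circle, which is an assumption of this illustration rather than anything stated in the source:

```python
# Momentum/winding contribution to the closed-string mass-squared on a circle
# of radius R (string scale alpha' set to 1 for simplicity):
#     M^2  ⊃  (n / R)^2 + (w * R)^2
# T-duality maps R -> 1/R while exchanging the momentum number n and the
# winding number w. The spectrum is unchanged, so the two descriptions are
# empirically indistinguishable even though their "elementary objects" differ.

def mass_squared(n, w, R):
    """Momentum (n) and winding (w) contribution to M^2 at radius R."""
    return (n / R) ** 2 + (w * R) ** 2

R = 0.37  # any radius works; duality relates it to 1/R
for n in range(-3, 4):
    for w in range(-3, 4):
        dual = mass_squared(w, n, 1.0 / R)  # R -> 1/R, n <-> w
        assert abs(mass_squared(n, w, R) - dual) < 1e-9
```

The point of the sketch is exactly the one made in the text: the “shape of the compact dimension” and the “type of elementary object” are representation-dependent parameters, not invariant facts about the world.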

An evaluation of the philosophical implications of duality in modern string theory must first acknowledge that the problems to identify uniquely the ontological basis of a scientific theory are as old as the concept of invisible scientific objects itself. Complex theories tend to allow the insertion of ontology at more than one level of their structure. It is not a priori clear in classical electromagnetism whether the field or the potential should be understood as the fundamental physical object and one may wonder similarly in quantum field theory whether that concept’s basic object is the particle or the field. Questions of this type clearly pose a serious philosophical problem. Some philosophers like Quine have drawn the conclusion to deny any objective basis for the imputation of ontologies. Philosophers with a stronger affinity for realism however often stress that there do exist arguments which are able to select a preferable ontological set after all. It might also be suggested that ontological alternatives at different levels of the theoretical structure do not pose a threat to realism but should be interpreted merely as different parameterisations of ontological reality. The problem is created at a philosophical level by imputing an ontology to a physical theory whose structure neither depends on nor predetermines uniquely that imputation. The physicist puts one compact theoretical structure into space-time and the philosopher struggles with the question at which level ontological claims should be inserted.

The implications of string-duality have an entirely different quality. String duality really posits different ‘parallel’ empirically indistinguishable versions of structure in spacetime which are based on different sets of elementary objects. This statement is placed at the physical level independently of any philosophical interpretation. Thus it transfers the problem of the lack of ontological uniqueness from a philosophical to a physical level and makes it much more difficult to cure. If theories with different sets of elementary objects give the same physical world (i. e. show the same pattern of observables), the elementary object cannot be seen as the unique foundation of the physical world any more. There seems to be no way to avoid this conclusion. There exists an additional aspect of duality that underlines its anti-ontological character. Duality does not just spell destruction for the notion of the ontological scientific object but in a sense offers a replacement as well.

Do there remain any loop-holes in duality’s anti-realist implications which could be used by the die-hard realist? A natural objection to the asserted crucial philosophical importance of duality can be based on the fact that duality was not invented in the context of string theory. It has been known since the time of P. A. M. Dirac that quantum electrodynamics with magnetic monopoles would be dual to a theory with inverted coupling constant and exchanged electric and magnetic charges. The question thus arises: if duality is poison to ontological realism, why did it not have its effect already at the level of quantum electrodynamics? The answer gives a nice survey of possible measures to save ontological realism. As it will turn out, they all fail in string theory.

In the case of quantum-electrodynamics the realist has several arguments to counter the duality threat. First, duality looks more like an accidental oddity that appears in an unrealistic scenario than like a characteristic feature of the world. No one has observed magnetic monopoles, which renders the problem hypothetical. And even if there were magnetic monopoles, an embedding of electromagnetism into a fuller description of the natural forces would destroy the dual structure anyway.

In string theory the situation is very different. Duality is no ‘lucky strike’ any more, which just by chance arises in a certain scenario that is not the real one anyway. As we have seen, it rather represents a core feature of the emerging theoretical structure and cannot be ignored. A second option open to the realist at the level of quantum electrodynamics is to shift the ontological posit. Some philosophers of quantum physics argue that the natural elementary object of quantum field theory is the quantum field, which represents something like the potentiality to produce elementary particles. One quantum field covers the full sum over all variations of particle exchange which have to be accounted for in a quantum process. The philosopher who posits the quantum field to be the fundamental real object discovered by quantum field theory understands the single elementary particles as mere mathematical entities introduced to calculate the behaviour of the quantum field. Dual theories from his perspective can be taken as different technical procedures to calculate the behaviour of the univocal ontological object, the electromagnetic quantum field. The phenomenon of duality then does not appear as a threat to the ontological concept per se but merely as an indication in favour of an ontologisation of the field instead of the particle.

The field-theoretical move of interpreting the quantum field as the ontological object has no counterpart in string theory. String theory exists only as a perturbative theory; there seems to be no way to introduce anything like a quantum field that would cover the full expansion of string exchanges. In the light of duality this lack of a unique ontological object arguably appears rather natural. The reason is related to another point that makes string dualities more dramatic than their field-theoretical predecessor: string theory includes gravitation. Therefore the object (the string geometry) and space-time are not independent. Indeed it turns out that the string geometry in a sense carries all information about space-time as well. This dependence of space-time on string geometry already makes it difficult to imagine how one could place into this very space-time some kind of overall field whose coverage of all string realisations would imply coverage of variations of space-time itself. The duality context makes the paradoxical quality of such an attempt more transparent. If two dual theories with different radii of a compactified dimension are to be covered by the same ontological object, in analogy to the quantum field in field theory, this object obviously cannot live in space and time. If it did, it would have to choose one of the two space-time versions endorsed by the dual theories, thereby discriminating against the other. Such a theory, however, should not be expected to be a theory of objects in space-time, and therefore raises no hopes of redeeming the external ontological perspective.

A third strategy to save ontological realism rests on the following argument: in quantum electrodynamics the difference between the dual theories boils down to a mere replacement of a weak coupling constant, which allows perturbative calculation, by a strong one, which does not. The choice is therefore open between a natural formulation and a clumsy, intractable one which should perhaps simply be discarded as an artificial construction.

Today string theory cannot tell whether its final solution will place its parameters comfortably in the low-coupling-constant-and-large-compact-dimension regime of one of the five superstring theories or of M-theory. This might be the case, but it might as well happen that the solution lies in a region of parameter space where no theory clearly stands out in this sense. However, even if there were one preferred theory, simply discarding the others could not save realism as it does in the case of field theory. First, the argument of natural choice is not really applicable to T-duality: a small compactification radius does not render a theory intractable the way a large coupling constant does, so choosing the dual version with a large radius looks more like a convention than anything else. Second, the choice of both compactification radii and string coupling constants in string theory is the consequence of a dynamical process that itself has to be calculated. Calculation thus comes before the selection of a certain point in parameter space, and consequently also before any possible selection of the ontological objects. The ontological objects, even if one wanted to hold on to their meaningfulness in the final scenario, would therefore appear as a mere product of prior dynamics and not as a priori actors in the game.
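The claim that a small compactification radius does not render the theory intractable can be made concrete with the standard T-duality relation for a closed string on a circle (quoted here as a sketch; oscillator contributions are suppressed):

```latex
% Closed-string mass spectrum on a circle of radius R (schematic;
% \alpha' is the Regge slope, n the momentum number, w the winding
% number, oscillator terms suppressed):
M^2 \;=\; \frac{n^2}{R^2} \;+\; \frac{w^2 R^2}{\alpha'^2}
          \;+\; \text{(oscillator terms)} .
% T-duality: the spectrum is invariant under
R \;\longmapsto\; \frac{\alpha'}{R} , \qquad n \;\leftrightarrow\; w ,
% whereas S-duality inverts the string coupling, g_s \mapsto 1/g_s .
```

Since the map R ↦ α′/R merely exchanges momentum and winding modes, neither radius is computationally privileged, which is why the choice of the large-radius description is conventional rather than forced.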

Summing up: the phenomenon of duality is admittedly a bit irritating for the ontological realist in field theory, but he can live with it. In string theory, however, the field-theoretical strategies to save realism all fail. The position assumed by the duality principle in string theory clearly renders obsolete the traditional realist understanding of scientific objects as smaller cousins of visible ones. The theoretical posits of string theory get their meaning only relative to their theoretical framework and must be understood as mathematical concepts without any claim to ‘corporeal’ existence in an external world. The world of string theory has cut all ties with classical theories about physical bodies. To stick to ontological realism in this altered context would be inadequate to the elementary changes which characterize the new situation. The demise of ontology in string theory opens new perspectives on those positions which stress the discontinuity of ontological claims throughout the history of scientific theories.

Quantum Music

Human neurophysiology suggests that artistic beauty cannot easily be disentangled from sexual attraction. It is, for instance, very difficult to appreciate Sandro Botticelli’s Primavera, arguably the “most beautiful painting ever painted,” when a beautiful woman or man is standing in front of that picture. Indeed, so strong may be the distraction, and so deep the emotional impact, that it might not be unreasonable to speculate whether aesthetics, in particular beauty and harmony in art, could best be understood in terms of surrogates for natural beauty, achieved through the process of artistic creation, idealization and “condensation.”

Figure: Sandro Botticelli, Primavera.

In this line of thought, in Hegelian terms, artistic beauty is the sublimation, idealization, completion, condensation and augmentation of natural beauty. Very differently from Hegel, who asserts that artistic beauty is “born of the spirit and born again, and the higher the spirit and its productions are above nature and its phenomena, the higher, too, is artistic beauty above the beauty of nature,” what is believed here is that human neurophysiology can hardly be disregarded in the human creation and perception of art, and in particular of beauty in art. Stated differently, we are inclined to believe that humans are so invariably determined by (or at least intertwined with) their natural basis that any neglect of it results in a humbling experience of irritation or even outright ugliness, no matter what social pressure groups or secret services may want to promote.

Thus, when it comes to the intensity of the experience, the human perception of artistic beauty, as sublime and refined as it may be, can hardly transcend natural beauty in its full exposure. In that way, art represents both the capacity as well as the humbling ineptitude of its creators and audiences.

Let us leave these idealistic realms and come back to the quantization of musical systems. The universe of music consists of an infinity, indeed a continuum, of tones and of ways to compose, correlate and arrange them. It is not evident how to quantize sounds, and in particular music, in general. One way to proceed would be a microphysical one: start with the frequencies of sound waves in air and quantize the spectral modes of these (longitudinal) vibrations, very much as one quantizes phonons in solid state physics.

For the sake of relating to music, however, a different approach is taken here, not dissimilar to the Deutsch-Turing approach to universal (quantum) computability, or to Moore’s automata analogues of complementarity: a musical instrument is quantized, restricted to one octave realized by the eight white keyboard keys, typically written c, d, e, f, g, a, b, c′ (in the C major scale).

In analogy to quantum information, the quantization of tones is considered within a nomenclature analogous to the classical musical representation, to be followed up by introducing typical quantum mechanical features such as the coherent superposition of classically distinct tones, as well as entanglement and complementarity in music.
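The coherent superposition of classically distinct tones can be sketched in code. The following is a minimal illustration of my own construction (the names `TONES`, `normalize` and `probabilities` are not from the text): a “quantum tone” over the eight white keys is a normalized vector of complex amplitudes, and the Born rule gives the probability of hearing each classical tone.

```python
import math

# A "quantum tone" over the eight white keys c..c' is a normalized
# vector of complex amplitudes; measurement yields one classical tone
# with probability |amplitude|^2 (Born rule).

TONES = ["c", "d", "e", "f", "g", "a", "b", "c'"]

def normalize(amps):
    """Scale amplitudes so their squared magnitudes sum to 1."""
    norm = math.sqrt(sum(abs(a) ** 2 for a in amps))
    return [a / norm for a in amps]

def probabilities(amps):
    """Born-rule probabilities for hearing each classical tone."""
    return {t: abs(a) ** 2 for t, a in zip(TONES, amps)}

# An equal-weight superposition of c and g, a classically impossible
# "both tones at once" state:
state = normalize([1, 0, 0, 0, 1, 0, 0, 0])
probs = probabilities(state)
```

Upon measurement this state collapses to c or to g with probability 1/2 each; no classical instrument has such a state in its repertoire.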

Causality


Causation is a form of event generation. To present an explicit definition of causation requires introducing some ontological concepts to formally characterize what is understood by ‘event’.

The concept of an individual is the basic primitive concept of any ontological theory. Individuals associate with other individuals to yield new individuals. It follows that they satisfy a calculus, and that they are rigorously characterized only through the laws of such a calculus. These laws are set with the aim of reproducing the way real things associate. Specifically, it is postulated that every individual is an element of a set s such that the structure S = ⟨s, ◦, ◻⟩ is a commutative monoid of idempotents, i.e. a simple additive semi-group with neutral element in which every element is idempotent.

In the structure S, s is the set of all individuals, the element ◻ ∈ s is a fiction called the null individual, and the binary operation ◦ is the association of individuals. Although S is a mathematical entity, the elements of s are not, with the sole exception of ◻, which is a fiction introduced in order to have a calculus. The association of any element of s with ◻ yields that same element. The following definitions characterize the composition of individuals.

1. x ∈ s is composed ⇔ (∃ y, z ∈ s) (x = y ◦ z)
2. x ∈ s is simple ⇔ ¬(∃ y, z ∈ s) (x = y ◦ z)
3. x ⊂ y ⇔ x ◦ y = y (x is part of y ⇔ x ◦ y = y)
4. Comp(x) ≡ {y ∈ s | y ⊂ x} is the composition of x.
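These definitions can be made executable in a toy model. The encoding below is my own (not the paper’s): individuals are represented as frozensets of “simple” individuals, association ◦ is set union (which is commutative, associative and idempotent), and the empty set plays the role of the null individual ◻.

```python
# Toy model of the monoid S = <s, o, NULL>: individuals as frozensets,
# association as union, the empty set as the fictional null individual.

NULL = frozenset()  # the null individual (neutral element)

def assoc(x, y):
    """Association x o y of two individuals."""
    return x | y

def is_part_of(x, y):
    """Definition 3: x is part of y  iff  x o y = y."""
    return assoc(x, y) == y

def composition(x, universe):
    """Definition 4: Comp(x) = all individuals in `universe` that are parts of x."""
    return {y for y in universe if is_part_of(y, x)}

a, b = frozenset({"a"}), frozenset({"b"})
ab = assoc(a, b)                      # a composed individual
assert assoc(ab, ab) == ab            # idempotency
assert assoc(a, NULL) == a            # NULL is neutral
assert is_part_of(a, ab) and is_part_of(b, ab)
```

The inline assertions check exactly the monoid laws the text postulates: idempotency, neutrality of ◻, and the part-of relation of definition 3.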

Real things are distinguished from abstract individuals because they have a number of properties in addition to their capability of association. These properties can be intrinsic (Pi) or relational (Pr). Intrinsic properties are inherent and are represented by predicates or unary applications, whereas relational properties depend upon more than a single thing and are represented by n-ary predicates with n ≥ 2. Examples of intrinsic properties are electric charge and rest mass, whereas the velocity of a macroscopic body and its volume are relational properties.

An individual together with its properties makes up a thing X: X = ⟨x, P(x)⟩

Here P(x) is the collection of properties of the individual x. A material thing is an individual with concrete properties, i.e. properties that can change in some respect.

The state of a thing X is a set S(X) of functions from a domain of reference M (a set that can be denumerable or nondenumerable) to the set of properties P(X). Every function in S(X) represents a property in P(X). The set of physically accessible states of a thing X is the lawful state space of X, denoted SL(X). The state of a thing is represented by a point in SL(X). A change of a thing is an ordered pair of states. Only changing things can be material; abstract things cannot change, since they have only one state (their properties are fixed by definition).

A legal statement is a restriction upon the state functions of a given class of things. A natural law is a property of a class of material things represented by an empirically corroborated legal statement.

The ontological history h(X) of a thing X is a subset of M × SL(X) defined by h(X) = {⟨t, F(t)⟩ | t ∈ M}

where t is an element of some auxiliary set M, and F are the functions that represent the properties of X.

If a thing is affected by other things we can introduce the following definition:

h(Y/X ) : “history of the thing Y in presence of the thing X”.

Let h(X) and h(Y) be the histories of the things X and Y, respectively. Then

h(Y/X) = {⟨t,H(t)⟩|t ∈ M},

where H ≠ F is the total state function of Y as affected by the existence of X, and F is the total state function of Y in the absence of X. The history of Y in the presence of X is different from the history of Y without X.

We can now introduce the notion of action:

X ▷ Y : “X acts on Y”

X ▷ Y =def h(Y/X) ≠ h(Y)

An event is a change of a thing X, i.e. an ordered pair of states:

(s1, s2) ∈ EL(X) = SL(X) × SL(X)

The space EL(X) is called the event space of X.

Causality is a relation between events, i.e. a relation between changes of states of concrete things. It is not a relation between things. Only the related concept of ‘action’ is a relation between things. Specifically,

C′(x): “an event in a thing x is caused by some unspecified event e^x_{x_i}”.

C′(x) =def (∃ e^x_{x_i}) [e^x_{x_i} ∈ EL(x) ⇔ x_i ▷ x].

C(x, y): “an event in a thing x is caused by an event in a thing y”.

C(x, y) =def (∃ e^x_y) [e^x_y ∈ EL(x) ⇔ y ▷ x].

In the above definitions, the notation e^x_y indicates by the superscript the thing x to whose event space the event e belongs, whereas the subscript denotes the thing whose action triggered the event. The implicit arguments of both C′ and C are events, not things. Causation is a form of event generation. The crucial point is that a given event in the lawful event space EL(x) is caused by an action of a thing y iff the event happens only conditionally on that action, i.e., e^x_y would not occur without an action of y upon x. Time does not appear in this definition, which allows causal relations in space-times without a global time orientation, and even instantaneous and non-local causation. If causation can be non-local under some circumstances, e.g. when a quantum system is prepared in a specific state of polarization or spin, then quantum entanglement poses no problem for realism and determinism: quantum theory describes an aspect of a reality that is ontologically determined and has non-local relations. Under no circumstances are the postulates of Special Relativity violated, since no physical system ever crosses the barrier of the speed of light.
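The definitions of action and causation can be checked on a toy example. The encoding is again my own (histories as maps from the index set M to state labels; none of these names come from the text): X acts on Y iff Y’s history in the presence of X differs from Y’s history alone, and an event is caused by X when it is a genuine change that occurs only conditionally on X’s action.

```python
# Toy model: a history is a dict t -> state over the index set M.
# X |> Y  iff  h(Y/X) != h(Y); an event is an ordered pair of states.

def acts_on(h_y_given_x, h_y):
    """X |> Y  =def  h(Y/X) != h(Y)."""
    return h_y_given_x != h_y

def events(history, times):
    """Ordered pairs of successive states, drawn from E_L(Y)."""
    return [(history[t0], history[t1]) for t0, t1 in zip(times, times[1:])]

M = [0, 1, 2]
h_y = {0: "s0", 1: "s0", 2: "s0"}          # Y alone: nothing changes
h_y_given_x = {0: "s0", 1: "s1", 2: "s1"}  # Y in the presence of X

assert acts_on(h_y_given_x, h_y)           # X |> Y holds

# Events caused by X: genuine changes present in h(Y/X) but not in h(Y),
# i.e. events that occur only conditionally on X's action.
caused = [e for e in events(h_y_given_x, M)
          if e not in events(h_y, M) and e[0] != e[1]]
```

Here `caused` contains exactly the transition ("s0", "s1"), which in the notation above is an event e^y_x: it belongs to Y’s event space and would not occur without the action of X.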