Stuxnet

Stuxnet is a threat targeting a specific industrial control system, likely in Iran, such as a gas pipeline or a power plant. The ultimate goal of Stuxnet is to sabotage that facility by reprogramming programmable logic controllers (PLCs) to operate as the attackers intend, most likely outside their specified boundaries.

Stuxnet was discovered in July 2010, but is confirmed to have existed at least one year prior, and likely even earlier. The majority of infections were found in Iran. Stuxnet contains many features, such as:

  • Self-replicates through removable drives by exploiting a vulnerability that allows automatic execution: the Microsoft Windows Shortcut ‘LNK/PIF’ Files Automatic File Execution Vulnerability (BID 41732).
  • Spreads in a LAN through a vulnerability in the Windows Print Spooler: the Microsoft Windows Print Spooler Service Remote Code Execution Vulnerability (BID 43073).
  • Spreads through SMB by exploiting the Microsoft Windows Server Service RPC Handling Remote Code Execution Vulnerability (BID 31874).
  • Copies and executes itself on remote computers through network shares.
  • Copies and executes itself on remote computers running a WinCC database server.
  • Copies itself into Step 7 projects in such a way that it automatically executes when the Step 7 project is loaded.
  • Updates itself through a peer-to-peer mechanism within a LAN.
  • Exploits a total of four unpatched Microsoft vulnerabilities: two are the previously mentioned vulnerabilities used for self-replication, and the other two are escalation-of-privilege vulnerabilities that have yet to be disclosed.
  • Contacts a command and control server that allows the hacker to download and execute code, including updated versions.
  • Contains a Windows rootkit that hides its binaries.
  • Attempts to bypass security products.
  • Fingerprints a specific industrial control system and modifies code on the Siemens PLCs to potentially sabotage the system.
  • Hides modified code on PLCs, essentially a rootkit for PLCs.

The following is a possible attack scenario. It is only speculation driven by the technical features of Stuxnet.

Industrial control systems (ICS) are operated by specialized, assembly-like code on programmable logic controllers (PLCs). The PLCs are often programmed from Windows computers not connected to the Internet or even to the internal network. In addition, the industrial control systems themselves are also unlikely to be connected to the Internet.

First, the attackers needed to conduct reconnaissance. As each PLC is configured in a unique manner, the attackers would first need the ICS’s schematics. These design documents may have been stolen by an insider or even retrieved by an early version of Stuxnet or another malicious binary. Once the attackers had the design documents and potential knowledge of the computing environment in the facility, they would develop the latest version of Stuxnet. Each feature of Stuxnet was implemented for a specific reason and for the final goal of potentially sabotaging the ICS.

Attackers would need to set up a mirrored environment that included the necessary ICS hardware, such as PLCs, modules, and peripherals, in order to test their code. The full cycle may have taken six months and five to ten core developers, not counting numerous other individuals such as quality assurance and management.

In addition, their malicious binaries contained driver files that needed to be digitally signed to avoid suspicion. The attackers compromised two digital certificates to achieve this task. They would have needed to obtain the digital certificates from someone who may have physically entered the premises of the two companies and stolen them, as the two companies are in close physical proximity.

To infect their target, Stuxnet would need to be introduced into the target environment. This may have occurred by infecting a willing or unknowing third party, such as a contractor who perhaps had access to the facility, or an insider. The original infection may have been introduced by removable drive.

Once Stuxnet had infected a computer within the organization, it began to spread in search of Field PGs, which are typical Windows computers used to program PLCs. Since most of these computers are non-networked, Stuxnet would first try to spread to other computers on the LAN through a zero-day vulnerability, a two-year-old vulnerability, infected Step 7 projects, and removable drives. Propagation through a LAN likely served as the first step, with propagation through removable drives covering the final hop to a Field PG that is never connected to an untrusted network.

While attackers could control Stuxnet with a command and control server, as mentioned previously the key computer was unlikely to have outbound Internet access. Thus, all the functionality required to sabotage a system was embedded directly in the Stuxnet executable. Updates to this executable would be propagated throughout the facility through a peer-to-peer method established by Stuxnet.

When Stuxnet finally found a suitable computer, one that ran Step 7, it would then modify the code on the PLC. These modifications likely sabotaged the system, which was likely considered a high value target due to the large resources invested in the creation of Stuxnet.

Victims attempting to verify the issue would not see any rogue PLC code as Stuxnet hides its modifications.

While the choice of self-replication methods may have been necessary to ensure a suitable Field PG would be found, it also caused noticeable collateral damage by infecting machines outside the target organization. The attackers may have considered the collateral damage a necessity in order to effectively reach the intended target. Also, the attackers had likely completed their initial attack by the time they were discovered.

Stuxnet dossier

Duqu 2.0

unsigned int __fastcall xor_sub_10012F6D(int encrstr, int a2)
{
  unsigned int result; // eax@2
  int v3;              // ecx@4

  if ( encrstr )
  {
    result = *(_DWORD *)encrstr ^ 0x86F186F1;
    *(_DWORD *)a2 = result;
    if ( (_WORD)result )
    {
      v3 = encrstr - a2;
      do
      {
        if ( !*(_WORD *)(a2 + 2) )
          break;
        a2 += 4;
        result = *(_DWORD *)(v3 + a2) ^ 0x86F186F1;
        *(_DWORD *)a2 = result;
      }
      while ( (_WORD)result );
    }
  }
  else
  {
    result = 0;
    *(_WORD *)a2 = 0;
  }
  return result;
}

A closer look at the above C code reveals that the string decryptor routine takes two parameters: “encrstr” and “a2”. First, the decryptor function checks that the input buffer (the pointer to the encrypted string) points to a valid memory area (i.e., it is not NULL). After that, the first 4 bytes of the encrypted string buffer are XORed with the key 0x86F186F1 and the result of the XOR operation is stored in the variable “result”. The first DWORD (first 4 bytes) of the output buffer a2 is then populated with this value (*(_DWORD *)a2 = result;). Therefore, the first 4 bytes of the output buffer will contain the first 4 bytes of the cleartext string.

If the first two bytes (first WORD) of the value stored in “result” are ‘\0’ characters, the original cleartext string was an empty string, and the output buffer is populated with a zero value stored in 2 bytes. If the first half of the decrypted block (the “result” variable) contains something else, the decryptor routine checks the second half of the block (“if ( !*(_WORD *)(a2 + 2) )”). If this WORD value is NULL, decryption ends and the output buffer will contain only one Unicode character followed by two closing ‘\0’ bytes.

If the first decrypted block doesn’t contain a zero character (generally this is the case), the decryption cycle continues with the next 4-byte encrypted block. The output buffer pointer is incremented by 4 bytes to be able to store the next cleartext block (“a2 += 4;”). After that, the following 4-byte block of the ciphertext is decrypted with the fixed decryption key (0x86F186F1). The result is then stored within the next 4 bytes of the output buffer. Now the output buffer contains 2 blocks of the cleartext string.

The loop condition checks whether the decryption has reached its end by examining the first half of the current decrypted block. If it has not reached the end, the cycle continues with the decryption of the next input blocks, as described above. Before decrypting each 4-byte ciphertext block, the routine also checks the second half of the previous cleartext block to decide whether the decoded string has ended.

The original Duqu used a very similar string decryption routine, shown below. We can see that this routine is an exact copy of the previously discussed routine (variable “a1” is analogous to the “encrstr” argument). The only difference between the Duqu 2.0 (duqu2) and Duqu string decryptor routines is the XOR key (in Duqu, the key is 0xB31FB31F).

We can also see that the decompiled code of Duqu contains the decryptor routine in a more compact form (within a “for” loop instead of a “while”), but the two routines are essentially the same. For example, the two boundary checks in the Duqu 2.0 routine (“if ( !*(_WORD *)(a2 + 2) )” and “while ( (_WORD)result );”) are analogous to the boundary check at the end of the “for” loop in the Duqu routine (“if ( !(_WORD)v4 || !*(_WORD *)(result + 2) )”). Similarly, the increment operation within the head of the for loop in the Duqu sample (“result += 4”) is analogous to the increment operation “a2 += 4;” in the Duqu 2.0 sample.

int __cdecl b31f_decryptor_100020E7(int a1, int a2)
{
  _DWORD *v2;      // edx@1
  int result;      // eax@2
  unsigned int v4; // edi@6

  v2 = (_DWORD *)a1;
  if ( a1 )
  {
    for ( result = a2; ; result += 4 )
    {
      v4 = *v2 ^ 0xB31FB31F;
      *(_DWORD *)result = v4;
      if ( !(_WORD)v4 || !*(_WORD *)(result + 2) )
        break;
      ++v2;
    }
  }
  else
  {
    result = 0;
    *(_WORD *)a2 = 0;
  }
  return result;
}
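Assuming the decompiled listings above are accurate, both decryptors reduce to XORing 4-byte blocks of a UTF-16LE string with a per-variant DWORD key. The following Python sketch reproduces that behaviour for both keys; the function names, the padding scheme, and the simplified termination check are my own illustrative assumptions, not code from the samples:

```python
import struct

def xor_decrypt(blob: bytes, key: int) -> str:
    """Decrypt 4-byte blocks of a UTF-16LE string XORed with a fixed DWORD key.

    Mirrors the decompiled routines: process one little-endian DWORD at a
    time and stop once either 16-bit half of a decrypted block is the
    UTF-16 terminator. Key is 0x86F186F1 for Duqu 2.0, 0xB31FB31F for Duqu.
    """
    out = bytearray()
    for i in range(0, len(blob) - 3, 4):
        (block,) = struct.unpack_from("<I", blob, i)
        plain = block ^ key
        out += struct.pack("<I", plain)
        if (plain & 0xFFFF) == 0 or (plain >> 16) == 0:
            break
    return out.decode("utf-16-le").split("\x00")[0]

def xor_encrypt(s: str, key: int) -> bytes:
    """Inverse operation, for testing: XOR with a fixed key is its own inverse."""
    raw = (s + "\x00").encode("utf-16-le")
    raw += b"\x00" * (-len(raw) % 4)  # pad to a DWORD boundary
    return b"".join(
        struct.pack("<I", struct.unpack_from("<I", raw, i)[0] ^ key)
        for i in range(0, len(raw), 4)
    )
```

Round-tripping a string through xor_encrypt and xor_decrypt with either sample’s key recovers the original, which is exactly the behaviour the two decompiled listings implement with different constants.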

Planetary Spirit

Cataclysm after cataclysm occurred, and the leaden slag of the fourth race sank to its doom, deluged by the waters of heaven and earth as they flooded the lands according to karmic law. Along with the sinking of Atlantis, which extended over several million years, new lands had been rising in other parts of the globe, and these became peopled as time went by with certain of the Atlanteans who settled there in two or three great migratory waves.

Thus the fourth root-race gave birth to the fifth whose cradleland was the Desert of Shamo or Gobi and surrounding tablelands — a country whose present sandy wastes give no hint of lands once rich with verdure, where forests and lakes witnessed a succession of civilizations as grand as any the world has ever known. Here for many millions of years, while Atlantis was involved in her death struggle, seeds of the new race were being sown in virgin soil.

Nature is beneficent in her workings. While the consequences of her human children must be met and faced by them through the working of karma and cyclic reimbodiment, yet at each new racial birth she casts her seed in freshly-turned soil, so that the child-race may be conceived in purity and nurtured in spirituality. Peopled thus with egos who had remained clean and strong through the Atlantean upheavals, and helped once again by the reentrance into their midst of semi-divine beings, the new race became a focus of spiritual light. As the Master Koot Hoomi (KH) wrote:

the highest Planetary Spirits, those, who can no longer err . . . appear on Earth but at the origin of every new human kind; at the junction of, and close of the two ends of the great cycle. And, they remain with man no longer than the time required for the eternal truths they teach to impress themselves so forcibly upon the plastic minds of the new races as to warrant them from being lost or entirely forgotten in ages hereafter, by the forthcoming generations. The mission of the planetary Spirit is but to strike the KEYNOTE OF TRUTH. — The Mahatma Letters to A. P. Sinnett, Letter IX,

Simultaneously with the establishment of the Mystery schools in Atlantis some four or five million years ago, the fifth or Aryan race was slowly coming into being, immensely aided by egos of spiritual refinement attracted there by ties of divine kinship. Gradually the soil was prepared and, the work of striking the “Keynote of Truth” having been accomplished, the demigods retired to their superior spheres. One million years ago the new race was ushered into adult existence impressed with the knowledge of “eternal truths.”

As the centuries passed and civilization succeeded civilization, the love of truth once again became dimmed in human hearts and the ancient precepts fell into disuse. The Mysteries were withdrawn even further, so that the knowledge once universal became the prized guerdon bestowed by the great Brotherhood upon that choice minority whose lives were dedicated to truth and truth alone, unstained by weakness or selfish ambition. With enduring consistency the ongoing purpose of the Mysteries has remained threefold in character:

(1) the persistent spiritualization of the thought-life of humanity so that knowledge of things spiritual may penetrate into the heart, and life in time may become a benediction of peace instead of a tragedy of conflict;

(2) seeding grounds of adepts, nurseries for future recruits, who through trial and initiation may become fit to receive the supreme dignity of membership in the great Brotherhood; and

(3) the preservation of truth for future races unsullied by human hand; and the polishing of the knowledge of truth through investigation by trained seers of the secrets of nature in worlds visible and invisible.

The first of these aims is fulfilled by the periodic appearance of world teachers, the inspirers of what later became the great religious and philosophical schools: messengers from the Lodge who come forth at cyclic periods to strike anew the “Keynote of Truth.” Hence every great religion, every noble philosophy, every fundamental scientific insight was born from the Sanctuary, to become a new religion, a new philosophy, a new science: fresh and new for the age and the people, but ancient beyond time because nurtured in the womb of esoteric antiquity.

All that is good, noble, and grand in human nature, every divine faculty and aspiration, were cultured by the Priest-Philosophers who sought to develop them in their Initiates. Their code of ethics, based on altruism, has become universal. — “The Origin of the Mysteries,” Blavatsky Collected Writings

The second of these aims is ages-long in accomplishment and deeply occult: to rouse the hidden fire of divinity in the human soul, and through the kindling of that flame burn the dross of imperfection, sloth, and unworthy desire from the heart. One of the impelling aims of such discipline is to restore to humanity inner sight, to free people “from every danger of being enslaved whether by a man or an idea”.

The disciple must become vajradhara (“diamond-bearer”), a title used for Bodhisattva Gautama, whose many-faceted heart was ever merciful in reflecting human sorrow, but whose spiritual essence was like a diamond, unyielding at its core to the subtle disguise of illusion (maya).

The third of these aims is made possible through the selection of new recruits into the Brotherhood, so that (a) truth may be preserved untarnished by human selfishness; and (b) investigation into the arcana of nature may go on unhindered, and the results of such examination by generations of trained seers be checked and rechecked, and only then recorded as occult fact for the benefit of humanity.

As far as the labor of the Masters is concerned, the following written by one of their number in 1881 speaks for itself:

If, for generations we have “shut out the world from the Knowledge of our Knowledge,” it is on account of its absolute unfitness; and if, notwithstanding proofs given, it still refuses yielding to evidence, then will we at the End of this cycle retire into solitude and our kingdom of silence once more. . . . We have offered to exhume the primeval strata of man’s being, his basic nature, and lay bare the wonderful complications of his inner Self — something never to be achieved by physiology or even psychology in its ultimate expression — and demonstrate it scientifically. It matters not to them, if the excavations be so deep, the rocks so rough and sharp, that in diving into that, to them, fathomless ocean, most of us perish in the dangerous exploration; for it is we who were the divers and the pioneers and the men of science have but to reap where we have sown. It is our mission to plunge and bring the pearls of Truth to the surface; theirs — to clean and set them into scientific jewels. And, if they refuse to touch the ill-shapen, oyster-shell, insisting that there is, nor cannot be any precious pearl inside it, then shall we once more wash our hands of any responsibility before human-kind.– Mahatma Letters, 

Unthanked, unknown, unconsidered, the Masters go on in their compassionate work for mankind’s enlightenment, a work that has never ceased in its outpouring of spiritual vitality for many millions of years, to continue another such period if necessity demand, until such time as humanity stirs from its lethargy and once again wills to unite its heart with truth. Master KH continues:

For countless generations hath the adept builded a fane of imperishable rocks, a giant’s Tower of INFINITE THOUGHT, wherein the Titan dwelt, and will yet, if need be, dwell alone, emerging from it but at the end of every cycle, to invite the elect of mankind to co-operate with him and help in his turn enlighten superstitious man. And we will go on in that periodical work of ours; we will not allow ourselves to be baffled in our philanthropic attempts until that day when the foundations of a new continent of thought are so firmly built that no amount of opposition and ignorant malice guided by the Brethren of the Shadow will be found to prevail.– Mahatma Letters.

Conceptual Jump Size & Solomonoff Induction. Note Quote.

Let M be a reference machine corresponding to a universal computer with a prefix-free code. In a prefix-free code, no codeword is a prefix of another; such a code is also called self-delimiting, a property most reasonable computer programming languages share. Ray Solomonoff asked how probable it is that an output string x is generated by M, considering the whole space of possible programs. By giving each program bitstring p an a priori probability of 2^(−|p|), we can ensure that the space of programs satisfies the probability axioms, by the extended Kraft inequality: an instantaneous code (prefix code, tree code) with codeword lengths l1, …, lN exists if and only if

∑_{i=1}^{N} 2^{−l_i} ≤ 1
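As a toy numerical illustration (not from the original text), the Kraft sum and the prefix-free property can both be checked directly; the code {0, 10, 110, 111} below is an arbitrary example:

```python
def kraft_sum(lengths):
    """Sum of 2^(-l_i) over codeword lengths; by the Kraft inequality,
    a prefix code with these lengths exists iff the sum is <= 1."""
    return sum(2.0 ** -l for l in lengths)

def is_prefix_free(codewords):
    """True iff no codeword is a proper prefix of another."""
    return not any(a != b and b.startswith(a) for a in codewords for b in codewords)

code = ["0", "10", "110", "111"]  # a complete prefix code
assert is_prefix_free(code)
assert kraft_sum([len(w) for w in code]) == 1.0  # 1/2 + 1/4 + 1/8 + 1/8
```

A sum of exactly 1 means the code is complete: no further codeword can be added without violating prefix-freeness, which is why a fair-coin program generator assigns it a proper probability distribution.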

In other words, we imagine that we toss a fair coin to generate each bit of a random program. This probability model of programs entails the following probability mass function (p.m.f.) for strings x ∈ {0, 1}∗:

P_M(x) = ∑_{M(p)=x*} 2^{−|p|} —– (1)

which is the probability that a random program produces an output beginning with x. P_M(x) is called the algorithmic probability of x, as it defines probability in terms of programs.

Using this probability model of bitstrings, one can make predictions. Intuitively, it is impossible to imagine intelligence in the absence of any prediction ability: purely random behavior is decisively non-intelligent. Since P_M is a universal probability model, it can be used as the basis of universal prediction, and thus intelligence. Perhaps Solomonoff’s most significant contributions were in the field of AI, as he envisioned a machine that can learn anything from scratch.
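Prediction by normalisation can be sketched in a few lines. In this toy example a computable Bernoulli source stands in for the predictor’s distribution, since the universal prior itself is uncomputable; all names here are illustrative:

```python
def bernoulli_prefix_prob(x, p1=0.8):
    """Prefix probability of bitstring x under a Bernoulli(p1) source:
    the product of per-bit probabilities."""
    prob = 1.0
    for bit in x:
        prob *= p1 if bit == "1" else (1.0 - p1)
    return prob

def predict_next(P, x):
    """P(next bit = 1 | x), obtained by normalising the prefix
    probabilities of the two one-bit extensions of x."""
    p0, p1 = P(x + "0"), P(x + "1")
    return p1 / (p0 + p1)
```

For the Bernoulli source the normalised prediction is 0.8 for every history, as expected; with the universal prior in place of the toy source, the same normalisation step yields Solomonoff’s sequence predictor.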

His main proposal for machine learning is inductive inference (Part 1, Part 2), for a variety of problems such as sequence prediction, set induction, operator induction and grammar induction. Without much loss of generality, we can discuss sequence prediction on bitstrings. Assume that there is a computable p.m.f. P1 over bitstrings. Given a bitstring x drawn from P1, we can define the conditional probability of the next bit simply by normalizing. Algorithmically, we would have to approximate (1) by finding short programs that generate x (the shortest of which is the most probable). In more general induction, we run all models in parallel, quantifying fit-to-data weighted by the algorithmic probability of the model, to find the best models and construct distributions; the common point is determining good models with high a priori probability. Finding the shortest program is undecidable in general; however, Levin search can be used for this purpose. There are two important results about Solomonoff induction that we shall mention here. First, Solomonoff induction converges very rapidly to the real probability distribution. The convergence theorem shows that the expected total squared error depends only on the algorithmic complexity of P1, which is independent of x. The following bound is discussed at length with a concise proof:

E_{P1} [ ∑_{m=1}^{n} ( P(a_{m+1} = 1 | a_1a_2…a_m) − P1(a_{m+1} = 1 | a_1a_2…a_m) )² ] ≤ −(1/2) ln P(P1) —– (2)

This bound characterizes the divergence of the Algorithmic Probability (ALP) solution from the real probability distribution P1. P(P1) is the a priori probability of the p.m.f. P1 according to our universal distribution P_M. On the right-hand side of (2), −ln P_M(P1) is roughly k·ln 2, where k is the Kolmogorov complexity of P1 (the length of the shortest program that defines it); thus the total expected error is bounded by a constant, which guarantees that the error decreases very rapidly as the example size increases. In algorithmic information theory, the Kolmogorov complexity of an object, such as a piece of text, is the length of the shortest computer program that produces the object as output. It is a measure of the computational resources needed to specify the object, and is also known as descriptive complexity, Kolmogorov–Chaitin complexity, algorithmic entropy, or program-size complexity.

Secondly, there is an optimal search algorithm to approximate Solomonoff induction, which adopts Levin’s universal search method to solve the problem of universal induction. The universal search procedure time-shares all candidate programs according to their a priori probability, with a clever watchdog policy to avoid the practical impact of the undecidability of the halting problem. The search procedure starts with a time limit t = t0, in each iteration tries all candidate programs c with a time limit of t·P(c), and, while a solution is not found, doubles the time limit t. The quantity t(s)/P(s) for a solution program s taking time t(s) is called the Conceptual Jump Size (CJS), and it is easily shown that Levin search terminates in at most 2·CJS time. To obtain alternative solutions, one may keep running after the first solution is found, as there may be more probable solutions that need more time. The optimal solution is computable only in the limit; this turns out to be a desirable property of Solomonoff induction, which is complete but incomputable.
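The time-sharing scheme with a doubling time limit can be sketched as a toy simulation. The step-budget model and every name below are illustrative assumptions rather than Levin’s exact formulation:

```python
def levin_search(candidates, is_solution, t0=1.0, max_doublings=40):
    """Toy Levin search. Each candidate is a (prior, program) pair, where
    program(budget) returns an answer if it can halt within `budget` steps
    and None otherwise. Each round gives candidate c a budget of t * prior(c);
    t doubles until some candidate produces a verified solution."""
    t = t0
    for _ in range(max_doublings):
        for prior, prog in candidates:
            result = prog(int(t * prior))  # watchdog: hard step cap
            if result is not None and is_solution(result):
                return result, t
        t *= 2.0
    return None, t

# A candidate that needs 10 steps and has prior 1/2: its conceptual jump
# size is t(s)/P(s) = 10 / 0.5 = 20, so the search must halt by t = 2*CJS = 40.
slow_42 = lambda budget: 42 if budget >= 10 else None
answer, t = levin_search([(0.5, slow_42)], lambda r: r == 42)
assert answer == 42 and t <= 2 * (10 / 0.5)
```

The final assertion checks the 2·CJS termination bound from the text on this toy instance: the solution appears once t·P(c) first reaches the program’s running time.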

The Transmission of Affect, or Brennan’s Argument Against Neo-Darwinism…Note Quote

[According to neo-Darwinism], the individual organism is born with the urges and affects that will determine its fate. Its predisposition to certain behaviors is part of its individual genetic package, and, of course, these behaviors are intrinsically affective. Such behaviors and affects may be modified by the environment, or they may not survive because they are not adaptive. But the point is that no other source or origin for the affects is acknowledged outside of the individual one. The dominant model for transmission in neo-Darwinism is genetic transmission… and the critical thing about it here is that its proponents ignore the claims of social and historical context when it comes to accounting for causation.

As Brennan convincingly argues below, the neo-Darwinist adopts an essentialist position that neglects to engage at all with the capacity of affects to occur outside of the genetically formed individual. 

To be sure, in both biological and non-biological contexts, the neo-Darwinian paradigm negates the creative potential of chance encounters by grossly inflating the status of a deterministic code mechanism. By analogy it attributes the same high level of agency to the fidelity, fecundity and longevity of the genetic package as it does to the passive passing on of a competing idea. Memetics crudely consigns, as such, the by and large capricious, unconscious and imitative transmission of desire and social invention through a population to an insentient surrender to a self-serving code.

Complex Life, Randomised Paragraph

In this informationscape, one of the most prominent identities of the gene is as information, code, program, blueprint, recipe, and ‘book of life’. These metaphors dominate popular understanding of genetics and molecular biology. A good deal of scepticism has been expressed about these ideas by biologists and philosophers of biology. The idea of genetic information has been dismissed as merely metaphorical or, slightly more positively, as referring to a loose collection of analogical models which make heuristic use of several different points of resemblance between molecular processes and human communication systems.

Invoking the physics and chemistry underlying the substrate as a promising way to break this so-called perennial mystery argues for the importance of Francis Crick’s 1958 definition, which relates directly to the actual genetic code: “information means here the precise determination of sequence, either of bases in the nucleic acid or of amino acid residues in the protein.” Surprisingly, this definition can be generalised to apply to non-genetic factors in development. Doing so makes it possible to state clearly why the origin of nucleic acid-based heredity was an evolutionary ‘key innovation’ that made possible the evolution of complex life, while maintaining a balanced view of the role of genetic and other causes in the developmental biology of modern organisms.

Deleuze, in the appendix to his book on Foucault, places the notion of the overman, or superman, outside the constraint of the usual anthropological forms of the human, and suggests that this expansiveness could be explored by two means: the potential foldings proper to the chains of the genetic code, and the potential of silicon in third-generation machines. It is the latter rather than the former that is crucial here for consciousness to break free of human-centric logic.
