Intrinsic Propensity for Vulnerability in Computers? Arbitrary Code Execution in the Universal Turing Machine

The universal Turing machine is generally considered to be the simplest, most abstract model of a computer. This paper reports on the discovery of an accidental arbitrary code execution vulnerability in Marvin Minsky's 1967 implementation of the universal Turing machine. By submitting crafted data, the machine may be coerced into executing user-provided code. The article presents the discovered vulnerability in detail and discusses its potential implications. To the best of our knowledge, an arbitrary code execution vulnerability has not previously been reported for such a simple system.


1 Introduction

Arbitrary code execution holds a special position among malicious exploits. Most remarkable is the case when such code execution is effected through the submission of crafted data. In computing, the relationship between structure and behavior, between program and process, is perplexing in itself. That this relationship so often can be subverted, allowing an untrusted data provider to preternaturally gain control over program execution, is disquieting. Why is this a common phenomenon in computer systems? Is it the consequence of incidental but unfortunate decisions in the development history of those systems, or is it rather the result of some fundamental property of computing?

Commonly used to explore the foundational traits of computers and computing, the universal Turing machine is generally considered one of the most important ideas in computer science. Turing presented his universal machine in a 1936 paper [14], where he promptly used it to settle one of the most pressing mathematical questions of the day, David Hilbert and Wilhelm Ackermann's so-called Entscheidungsproblem [7]. As expressed by Marvin Minsky, “the universal machine quickly leads to some striking theorems bearing on what appears to be the ultimate futility of attempting to obtain effective criteria for effectiveness itself” [8]. But the universal Turing machine achieved more than that. As stated by Davis, Sigal and Weyuker in [4], “Turing’s construction of a universal computer in 1936 provided reason to believe that, at least in principle, an all-purpose computer would be possible, and was thus an anticipation of the modern digital computer.” Or, in the words of Stephen Wolfram [16], “what launched the whole computer revolution is the remarkable fact that universal systems with fixed underlying rules can be built that can in effect perform any possible computation.” Not only the universality, but also the simplicity of the universal Turing machine has attracted interest. In 1956, Claude Shannon explored some minimal forms of the universal Turing machine [13], and posed the challenge to find even smaller such machines. That exploration has continued to this day [16].

A common strategy for understanding a problem is to reduce it to its minimal form. In the field of computer security, we may ask the question: "What is the simplest system exploitable to arbitrary code execution?" In this article, we propose an answer to that question by reporting on the discovery that a well-established implementation [8] of the universal Turing machine is vulnerable to a form of arbitrary code execution that is both unintentional and non-trivial.

The article proceeds in the next section with a background on arbitrary code execution. Section 3 reviews universal Turing machines, and in particular the studied implementation. This is followed, in Section 4, by a detailed analysis of the discovered vulnerability. In Section 5, we consider changes to the explored implementation that would mitigate the vulnerability. The paper concludes with a discussion of the significance of the findings, and some conclusions.

2 Arbitrary Code Execution

It is remarkable indeed that a software user who is nominally only granted the possibility to provide some trivial piece of data, such as her name, is sometimes able, by carefully crafting that seemingly inconsequential data, to take full control of the computer executing that software. It is even more arresting that such arbitrary code execution vulnerabilities are quite frequently discovered in software systems. Arbitrary code execution is not a fringe phenomenon, but a material class of vulnerabilities in modern computer systems. There are several specific types of weaknesses that may lead to arbitrary code execution. Among the 2019 CWE (Common Weakness Enumeration) Top 25 Most Dangerous Software Errors, the following may lead to arbitrary code execution [9]:

CWE-119 Improper Restriction of Operations within the Bounds of a Memory Buffer
CWE-79 Improper Neutralization of Input During Web Page Generation
CWE-20 Improper Input Validation
CWE-89 Improper Neutralization of Special Elements used in an SQL Command
CWE-416 Use After Free
CWE-190 Integer Overflow or Wraparound
CWE-78 Improper Neutralization of Special Elements used in an OS Command
CWE-787 Out-of-bounds Write
CWE-476 NULL Pointer Dereference
CWE-434 Unrestricted Upload of File with Dangerous Type
CWE-94 Improper Control of Generation of Code
CWE-502 Deserialization of Untrusted Data
Table 1: Vulnerabilities that may lead to code execution

Those twelve constitute almost half of the top-25 list, highlighting the prevalence of this class of vulnerability. It is not, however, clear whether there is any common underlying cause of these vulnerabilities; is there any root explanation as to why they are so prevalent?

3 Universal Turing machine

As preparation for the presentation of the arbitrary code execution vulnerability in Section 4, we review the concept of the Turing machine, the universal Turing machine, and the Minsky implementation of that universal machine.

3.1 Turing machine

A Turing machine, T, is a finite-state machine operating on a tape by means of the machine’s head (cf. Figure 1). The tape has the form of a sequence of squares, onto one of which the head is positioned. The head can read and write symbols located in the currently scanned square. It can also move one square to the left or right.

Figure 1: A Turing machine.

The input to the finite-state machine is the currently scanned symbol, while the output is the printed symbol as well as the direction in which the head is to move. The finite-state machine, which thus controls the actions of the head, can therefore be represented as a quintuple,

(q_i, s_j, q_ij, s_ij, d_ij)

where q_i represents the source state, s_j the scanned symbol, q_ij the target state, s_ij the printed symbol and d_ij the direction in which the head is to move.
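To make the quintuple notation concrete, the following Python sketch (our own illustration, not part of Minsky's construction; the function and variable names are hypothetical) stores quintuples in a table indexed by source state and scanned symbol, and performs a single step of a Turing machine:

    # A quintuple (q_i, s_j, q_ij, s_ij, d_ij) is stored as
    # rules[(q_i, s_j)] = (q_ij, s_ij, d_ij), with d_ij in {"L", "R"}.
    def step(rules, tape, head, state):
        """Perform one step; return the new (tape, head, state)."""
        scanned = tape[head]
        new_state, printed, direction = rules[(state, scanned)]
        tape = tape[:head] + printed + tape[head + 1:]  # print the symbol
        head += -1 if direction == "L" else 1           # move the head
        return tape, head, new_state

    # Example: in state 00, reading 1, print 1, move left and enter state 01.
    rules = {("00", "1"): ("01", "1", "L")}
    print(step(rules, "0010", 2, "00"))                 # -> ('0010', 1, '01')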

3.2 Universal Turing machine

A universal Turing machine, U, is a Turing machine that is capable of simulating any other Turing machine, T. There are multiple implementations of the universal Turing machine, the first notably being the one proposed by Alan Turing himself in [14]. In this article, we consider the universal Turing machine proposed by Marvin Minsky in [8]. Our choice of the Minsky version is mainly based on (i) the ease with which it can be implemented, (ii) the ease with which it can be explained in a brief article, and (iii) its solid place in computer science literature, having been presented in Minsky's much-cited book Computation: Finite and Infinite Machines (1967). Turing’s own universal machine is arguably more convoluted, and also contains a number of errors [3]. Other universal Turing machines include a set of minimally small universal Turing machines, measured by the size of their alphabet and of their finite-state machine [11][17]. Those machines, however, add cognitive complexity by introducing an additional formalism (a tag system) in order to minimize the size of the machines.

Figure 2: Finite-state machine of Marvin Minsky’s universal Turing machine.

3.2.1 Machine structure

In the words of Minsky himself, the universal machine, U,

will be given just the necessary materials: a description, on its tape, of T and of [the initial configuration on T's own, simulated tape]; some working space; and the built-in capacity to interpret correctly the rules of operation as given in the description of T. Its behavior will be very simple. U will simulate the behavior of T one step at a time. It will be told by a marker M at what point on its tape T begins, and then it will keep a complete account of what T's tape looks like at each moment. It will remember what state T is supposed to be in, and it can see what T would read on the ’simulated’ tape. Then U will simply look at the description of T to see what T is next supposed to do, and do it! This really involves no more than looking up, in a table of quintuples, to find out what symbol to write, which way to move, and what new state to go into. We will assume that T has a tape which is infinite only to the left, and that it is a binary (2-symbol) machine. These restrictions are inessential, but make matters much simpler.

Concretely, U's tape is divided into four regions. The infinite region to the left will be the tape of the simulated machine, T. The second region, q, contains the name of the current state of T. The third region, s, stores the value of the symbol under T's head. Together, we denote q and s the machine condition. The fourth region, d(T), will contain the machine description of T, i.e. the program.

Figure 3: The state machine of a binary counter Turing machine.

If U is to simulate the binary counter represented in Figure 3, U's tape may initially be configured as follows,

...00000M000Y001X0000001X0010110X0100011X0110100Y00...

where U's head initially rests on the X preceding the machine description, the M marks the location of T's head, the leftmost Y separates T's tape from the machine condition, and the leftmost X identifies the start of the machine description, d(T). Within d(T), quintuples are separated by Xs, and the rightmost Y marks the end of the machine description.

The machine description of T is constituted of a set of quintuples, (q_i, s_j, q_ij, s_ij, d_ij), recorded in binary format. An example would be 0010110, where q_i = 00, s_j = 1, q_ij = 01, s_ij = 1, and d_ij = 0. Any number of binary digits may be used to represent a machine state. We adopt the convention that d_ij = 0 indicates a shift of the head to the left, while d_ij = 1 shifts the head to the right.
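As an illustration of this encoding, the following Python sketch (our own helper; the function name is hypothetical) packs quintuples into the 7-digit binary form and assembles an initial tape for U in the layout shown above, reproducing the binary-counter example (leading and trailing blank 0s omitted):

    def encode_quintuple(q_i, s_j, q_ij, s_ij, d_ij):
        """Concatenate the five fields into the binary form used on U's tape."""
        return q_i + s_j + q_ij + s_ij + d_ij

    # The four quintuples of the binary counter (d_ij: 0 = left, 1 = right),
    # read off from the example tape above.
    counter = [("00", "0", "00", "0", "1"),
               ("00", "1", "01", "1", "0"),
               ("01", "0", "00", "1", "1"),
               ("01", "1", "01", "0", "0")]

    description = "X".join(encode_quintuple(*q) for q in counter)
    # T's tape (M marking T's head), the condition q + s, and d(T):
    tape_of_U = "M000" + "Y" + "001" + "X" + description + "Y"
    print(tape_of_U)  # M000Y001X0000001X0010110X0100011X0110100Y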

3.2.2 Machine execution

U's finite-state machine, presented in Figure 2, consists of four distinct phases, marked as i-iv in the figure. The first phase uses T's state, stored in q, and the symbol under T's head, stored in s, to identify the next quintuple to execute. A recurring approach for marking positions is to recast 0s and 1s into As and Bs. The example tape presented above would by the first phase be modified to

...00000M000Y001XAAAAAABXAAB0110X0100011X0110100Y00...

where the transition from As and Bs into 0s and 1s specifies the position of the quintuple (0010110, in the example) matching the machine condition.

From the identified quintuple, the second phase copies the target state, q_ij (01 in the example), and the symbol to be written, s_ij (1), to q and s, and remembers the direction, d_ij, by entering the appropriate state (state 13 or 14 in Figure 2).

...00000M000YABBXAAAAAABXAABABBAX0100011X0110100Y00...

The third phase records that direction by replacing T's head symbol M with an A or B (A in the example), performs some clean-up, and replaces the symbol to be written, s_ij, stored in s, with an S, instead remembering that symbol in U's internal state.

...00000A000Y01SX0000001X0010110X0100011X0110100Y00...

The fourth and final phase performs the actual operations of T: it prints s_ij (1 in the example) in the appropriate location on T's tape, places an M to the left or right of that symbol depending on d_ij, and performs some final clean-up.

...0000M1000Y01AX0000001X0010110X0100011X0110100Y00...

At this point, the first execution cycle is complete, and the second cycle begins.
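A minimal Python sketch of what phases i and ii accomplish, expressed directly on the strings rather than via U's finite-state machine (the helper name is hypothetical), locates the quintuple whose leading digits match the machine condition and returns its action part:

    def find_action(condition, description):
        """Return (new state, symbol to print, direction) of the matching quintuple.

        condition   -- the state digits followed by the scanned symbol, e.g. "001"
        description -- the quintuples separated by Xs, e.g. "0000001X0010110X..."
        """
        for quintuple in description.split("X"):
            if quintuple.startswith(condition):          # phase i: locate the match
                action = quintuple[len(condition):]      # phase ii: take the tail
                return action[:-2], action[-2], action[-1]
        raise ValueError("no quintuple matches the machine condition")

    d_T = "0000001X0010110X0100011X0110100"
    print(find_action("001", d_T))   # -> ('01', '1', '0'): state 01, print 1, move left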

4 Exploiting the Universal Turing Machine

Users of computer systems typically provide the input to, or arguments of, the computations, and are provided with the results. From the point of view of computer security, it is typically undesirable to allow the user to subvert the functionality of the program performing the computation. A malicious actor may, however, attempt to do so. A particularly serious security vulnerability arises when it is possible for the end user to provide maliciously crafted data that effectively allows the execution of arbitrary code. In this section, we demonstrate that Marvin Minsky's universal Turing machine suffers from such an arbitrary code execution vulnerability.

4.1 Trust boundary

There is one obvious trust boundary in a universal Turing machine, U: the initial string on the tape of the simulated Turing machine, T. That string corresponds to the user-provided data of an ordinary computer program. Because the potential users may be unknown to the developers and administrators of the computer and its programs, it is common to view this data as untrusted. In our explorations of the universal Turing machine, we will make the same assumption. Therefore, if it were possible to execute arbitrary code without manipulating the program of U, i.e. the machine description d(T), but only by providing crafted data on T's simulated tape, that would constitute a vulnerability.

4.2 Requirements on the machine description

Nearly all possible machine descriptions of T appear to be vulnerable to arbitrary code execution. We consider the case when the first executed quintuple is of the form (q_i, s_j, q_ij, s_ij, 0). The final symbol, the direction d_ij, is thus fixed to 0, meaning that the head does not shift right in the first execution cycle. This is arguably in line with Minsky, p. 138, [8]: "We will assume that T has a tape which is infinite only to the left […]". Because exploitation occurs already in the first executed quintuple, additional quintuples will not affect the outcome.

4.3 The exploit

The following crafted input data will achieve arbitrary code execution by injecting a new Turing machine, E, and coercing U into simulating it:

[input data] Y [condition] X [quintuple 1] X ... X [quintuple n] S

where [input data] represents the input data provided to the injected machine, E, [condition] plays the role of E's machine condition, and [quintuple 1] through [quintuple n] are the quintuples of E. All values to the right of the Y need to be coded as Bs and As instead of 1s and 0s. The first symbol of [condition] must be an A if the symbol printed, s_ij, by the first executed quintuple of T is a 0, and a B if it is a 1; the remaining symbols give the name of E's current state.

4.4 Example exploit

We explain the exploitation mechanism by an example, where T's machine description, d(T), consists of a simple program acting as a binary counter, according to Figure 3, with d(T) = 0000001 X 0010110 X 0100011 X 0110100.

T's initial tape is laid out as follows:

M000Y001X0000001X0010110X0100011X0110100Y00

The injected machine, E, will aim to wipe the tape clean of user input, thus writing a 0 whenever encountering either a 0 or a 1, and then shifting left. This can be accomplished with two quintuples, (00, 0, 00, 0, 0) and (00, 1, 00, 0, 0), encoded as 0000000 and 0010000, respectively.
According to the previous subsection, the crafted input will take the form

1111YBAAXAAAAAAAXAABAAAAS

where 1111 is the data on which the injected machine, E, will operate, the symbols BAA between the Y and the first X play the role of E's machine condition (the B reflecting the 1 printed by T's first executed quintuple, the AA the state of E), and AAAAAAAXAABAAAA encodes E's machine description, d(E), thus representing the wiper program.
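The crafted string can be assembled mechanically. The following Python sketch (our own reconstruction of the recipe in Section 4.3; the helper names are hypothetical) recodes the wiper's quintuples into As and Bs and reproduces the exploit string of this example:

    def to_AB(bits):
        """Recode 0s and 1s as As and Bs, as required to the right of the injected Y."""
        return bits.replace("0", "A").replace("1", "B")

    wiper = ["0000000", "0010000"]   # the two quintuples of the injected machine E
    data = "1111"                    # input for E to operate on (and to be wiped)
    # Condition mimic between the injected Y and the first X: a B because T's first
    # executed quintuple prints a 1, followed by AA (00), the state of the wiper.
    condition = to_AB("1") + to_AB("00")
    payload = data + "Y" + condition + "X" + "X".join(to_AB(q) for q in wiper) + "S"
    print(payload)                   # -> 1111YBAAXAAAAAAAXAABAAAAS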

4.4.1 First execution cycle

At the start of execution, U's tape has the following appearance:

...001111YBAAXAAAAAAAXAABAAAASM000Y001X0000001X0010110X0...

with U's head positioned on the X between T's currently scanned symbol, stored in s, and the machine description, d(T). The first three phases of U follow the description in Section 3, finding the quintuple, 0010110, matching the machine condition stored in q and s, and replacing that machine condition with the action part of the identified quintuple.

...001111YBAAXAAAAAAAXAABAAAASA000Y01SX0000001X0010110X0...

In the fourth phase, U aims to perform the action on T's tape as specified by the retrieved quintuple. It does write a 1 at the expected location, but the crafted input, consisting of As and Bs instead of the expected 0s and 1s, causes U's head to shift far left into the user-provided data, placing the marker, M, representing T's head, at an unexpected location.

...00111MYBAAXAAAAAAAXAABAAAAB1000Y01SX0000001X0010110X0...

At the end of U's first execution cycle, not only T's but also U's head comes to rest further to the left than expected. Importantly, U's head is located to the left of the symbol Y indicating the end of T's tape.

4.4.2 Second execution cycle

Because U's head is located in the attacker-controlled segment of the tape, in its attempt to identify the next quintuple to execute, the first phase of U's second execution cycle mistakenly refers to the injected machine condition. Looking for 0s and 1s rather than As and Bs, it will not find anything in the injected machine description, d(E). Instead, it encounters the first match, 100, at a rather random location, just before the Y representing the end of T's tape. The initial 1 is the result of the first, and successful, print operation performed by U. The ensuing 00 are simply a part of a buffer between T's tape and machine condition, as introduced by Minsky in [8].

...00111MY100XAAAAAAAXAABAAAABBAA0Y01SX0000001X0010110X0...

In the second phase, attempting to collect the action part of the identified quintuple, U will find the four digits closest to the right of its head. While these were supposed to constitute the tail end of a quintuple, they are instead pieces of the aforementioned buffer, of T's machine condition, and of T's first quintuple, jointly creating the string 0010, which is thus interpreted as the action part of a quintuple. Furthermore, in the middle of the attempt to copy the first three of these digits to the injected machine condition, U slips back to the right of the Y indicating the start of T's machine condition. The end result is that the first digit is copied to the injected machine condition while the remaining part is copied to T's machine condition. At the end of this phase, the complete tape has the following layout:

...00111MYA00XAAAAAAAXAABAAAABBAAAYAASXA000001X0010110X0...

The third phase reverts the As and Bs to 0s and 1s, and replaces T's head, M, with a symbol indicating the direction of the next shift. U's head is once again positioned far into the untrusted, user-provided data.

...00111AY00SX0000000X001000011000Y00SX0000001X0010110X0...

In the fourth and final phase of the second execution cycle, U shifts the simulated head, M, one step and records the symbol under it in the injected machine condition.

...0011M0Y00BX0000000X001000011000Y00SX0000001X0010110X0...

At this point, the compromise is complete, as the injected machine, E, is syntactically correct, and the head of U is located in the injected machine condition rather than in the originally intended one. The head will never again traverse the Y denoting the end of T's tape, interpreting it instead as the Y marking the end of E's machine description, d(E).

4.4.3 Subsequent execution cycles

The following execution cycles will faithfully execute the injected machine, E, wiping the contents of the input provided to it:

...001M00Y00BX0000000X001000011000Y00SX0000001X0010110X0...
...00M000Y00BX0000000X001000011000Y00SX0000001X0010110X0...
...0M0000Y00AX0000000X001000011000Y00SX0000001X0010110X0...
...M00000Y00AX0000000X001000011000Y00SX0000001X0010110X0...
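Running the injected wiper directly, outside of U, shows the behavior that these cycles reproduce. A minimal sketch, assuming the head starts at the right end of the supplied input (the real tape is infinite to the left, so we simply stop when the input is exhausted):

    # E's rules: in state 00, write a 0 and move left, whatever symbol is scanned.
    rules = {("00", "0"): ("00", "0", "L"), ("00", "1"): ("00", "0", "L")}

    tape, head, state = list("1111"), 3, "00"
    while head >= 0:
        state, printed, _ = rules[(state, tape[head])]
        tape[head] = printed
        head -= 1                    # d_ij = 0: shift left
    print("".join(tape))             # -> 0000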

5 Mitigations

It is possible to improve on the Minsky implementation in order to mitigate the presented vulnerability. As a first mitigation strategy, we propose to validate inputs. This could be performed by introducing a preprocessing phase that validates that the simulated machine's tape consists only of the expected 0s and 1s.
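A minimal Python sketch of such a preprocessing check (our own, hypothetical helper), rejecting any simulated tape that contains more than the expected binary symbols:

    def validate_simulated_tape(tape_region):
        """Reject user-provided tape content that is not strictly binary."""
        if not set(tape_region) <= {"0", "1"}:
            raise ValueError("simulated tape may only contain 0s and 1s")
        return tape_region

    validate_simulated_tape("1111")                            # accepted
    try:
        validate_simulated_tape("1111YBAAXAAAAAAAXAABAAAAS")   # the exploit string
    except ValueError as err:
        print("rejected:", err)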

Secondly, we can restrict the execution space by reducing the number of defined quintuples. To simplify the finite-state description captured by Figure 2, Minsky declares many quintuples implicitly: “The most common quintuples, of the form (qi, sj, qi, sj, dij) are simply omitted [in the diagram].” [8]. This is convenient, because explicitly adding all necessary such quintuples to the diagram would require close to 70 arrows in addition to the 45 currently in it. However, because the implicit definition creates approximately twice as many quintuples as the roughly 70 required, this strategy makes the machine accept many tape symbols that are not necessary for its proper functioning. Of the roughly 70 unnecessary quintuples, close to 30 were required to enable the exploitation demonstrated in the previous section. This mitigation does require some effort, as the roughly 70 required quintuples must be specified explicitly.

Thirdly, we could fortify the division between program and data. In Minsky's implementation, only 17 of the 184 quintuples defining U are supposed to operate on T's tape. By, for instance, using special symbols on T's tape, and ensuring that only those 17 privileged quintuples read and write that alphabet, additional barriers to exploitation would be established.

6 Discussion

We return to the puzzling prevalence of arbitrary code execution vulnerabilities. Are these the result of some fundamental property of computing, or are they rather the consequence of coincidental, unfortunate decisions during the development of those systems?

It is interesting to note that, as was the case for Minsky's universal Turing machine, arbitrary code execution vulnerabilities can be accidentally introduced even in the simplest computer model. Minsky obviously attempted to design neither a secure nor a vulnerable system, but despite his indifference to the matter, he happened to design a vulnerable machine. That would suggest that vulnerability is a property that is not unlikely to arise in universal Turing machines. The volume of vulnerabilities discovered in computer systems in recent decades would further support such a proposition. Is it then the case that computers are intrinsically brittle, in that they at their very core have a propensity for arbitrary code execution vulnerabilities?

Considering the exploitation of the universal Turing machine in the previous section, we may speculate about reasons for such a potential propensity. One suggested root cause of computer insecurity is complexity [12]. While there is surely truth to that statement, the insecurity of Minsky’s minimally complex computing machine would appear to indicate a propensity to vulnerability even in the absence of complexity.

A related theory points the finger at the human factor [6]: poor decision-making by people is the cause of insecurity. But this hypothesis, too, insufficiently explains the problems of Minsky's universal Turing machine. One could possibly argue that Marvin Minsky suffered from a lack of security awareness, but that alone would not explain the demonstrated possibility for the user to achieve code execution. There is also something inherent to the machine that makes exploitation not only theoretically possible, but oftentimes also practically achievable.

Another proposed theory blames John von Neumann's stored program concept [15] for the woes of arbitrary code execution: the fact that data and program in computers are stored on the same storage medium may allow an attacker to illegitimately modify the program rather than the intended data [10]. Considering the universal Turing machine, this does indeed seem to have something to do with its vulnerability. The exploit demonstrated in the previous section is striking in the manner in which the machine head so unquestioningly saunters from program to data. This does intimate that the answer is at least partially related to the flimsy border between program and data. However, it can only be a partial answer to the question. While the co-location of program and data in the same memory unit might be a cause of the vulnerability of the universal Turing machine, and of the general prevalence of buffer overflows (CWE-119) and use-after-free vulnerabilities (CWE-416), it is less obvious how it would constitute a root cause of SQL injection vulnerabilities (CWE-89) or operating system command injection vulnerabilities (CWE-78). Return-oriented programming [2] is another argument against the stored-program hypothesis: in such an attack, the program is never modified. Instead, the program flow is deftly controlled by the attacker, cherry-picking among assembly statements in the unmodified original program text to concoct attacker-controlled behavior. Return-oriented programming can almost always substitute for modification of the actual program. Even if programs are located in a memory separate from data, they typically need to be modifiable by some means; if they are hard-coded, the infinitely flexible universal Turing machine reduces to yet another specific Turing machine. And if the programs are indeed located in a separate but writeable memory (e.g. as in the modified Harvard architecture), then it would appear that the vulnerabilities of modifiable program code are back, as demonstrated e.g. in [5].

A final theory that may be on the cusp of explaining computers' propensity for vulnerability is suggested by Bratus et al. in the ;login: article Exploit Programming [1], considering the weird and oftentimes surprisingly potent machines that may appear when unexpected, crafted input is provided to a computer program. A remarkable prevalence of such machines is, I believe, at the heart of the problem.

7 Conclusion

This paper presents the discovery of an arbitrary code execution vulnerability in Marvin Minsky’s 1967 universal Turing machine implementation. By submitting crafted input data, an attacker can coerce the machine into executing arbitrary instructions. While this vulnerability has no real-world implications, we discuss whether it indicates an intrinsic propensity for arbitrary code execution vulnerabilities in computers in general.

Availability

A Python program implementing and exploiting the Minsky Turing machine considered in this paper is available on GitHub at https://github.com/intrinsic-propensity/turing-machine.

References

  • [1] S. Bratus, M. E. Locasto, M. L. Patterson, L. Sassaman, and A. Shubina (2011) Exploit programming: from buffer overflows to weird machines and theory of computation. ;login: 36 (6), USENIX.
  • [2] S. Checkoway, L. Davi, A. Dmitrienko, A. Sadeghi, H. Shacham, and M. Winandy (2010) Return-oriented programming without returns. In Proceedings of the 17th ACM Conference on Computer and Communications Security, pp. 559–572.
  • [3] B. J. Copeland (2004) The Essential Turing. Clarendon Press.
  • [4] M. Davis, R. Sigal, and E. J. Weyuker (1994) Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science. Elsevier.
  • [5] A. Francillon and C. Castelluccia (2008) Code injection attacks on Harvard-architecture devices. In Proceedings of the 15th ACM Conference on Computer and Communications Security, pp. 15–26.
  • [6] J. J. Gonzalez and A. Sawicka (2002) A framework for human factors in information security. In WSEAS International Conference on Information Security, Rio de Janeiro, pp. 448–187.
  • [7] D. Hilbert and W. Ackermann (1938) Grundzüge der theoretischen Logik, Berlin 1928. Die Grundlehren der mathematischen Wissenschaften in Einzeldarstellungen mit besonderer Berücksichtigung der Anwendungsgebiete 27.
  • [8] M. L. Minsky (1967) Computation: Finite and Infinite Machines. Prentice-Hall, Englewood Cliffs.
  • [9] MITRE (2019) 2019 CWE Top 25 Most Dangerous Software Errors. https://cwe.mitre.org/top25/archive/2019/2019_cwe_top25.html. Accessed: 2020-02-07.
  • [10] G. Paul (1973) Method and apparatus for providing a security system for a computer. US Patent 3,744,034, July 3, 1973.
  • [11] Y. Rogozhin (1996) Small universal Turing machines. Theoretical Computer Science 168 (2), pp. 215–240.
  • [12] B. Schneier (1999) A plea for simplicity: you can’t secure what you don’t understand. Information Security.
  • [13] C. E. Shannon (1956) A universal Turing machine with two internal states. Automata Studies 34, pp. 157–165.
  • [14] A. M. Turing (1936) On computable numbers, with an application to the Entscheidungsproblem. Proceedings of the London Mathematical Society 2 (1), pp. 230–265.
  • [15] J. von Neumann (1993) First draft of a report on the EDVAC. IEEE Annals of the History of Computing 15 (4), pp. 27–75.
  • [16] S. Wolfram (2002) A New Kind of Science. Vol. 5, Wolfram Media, Champaign, IL.
  • [17] D. Woods and T. Neary (2009) The complexity of small universal Turing machines: a survey. Theoretical Computer Science 410 (4-5), pp. 443–450.