The Church/Turing thesis, in computability theory, is a hypothesis regarding the nature of *computable functions*. Principally, it serves as a response to the question: what does it mean for something to be *computable*? This notion of *computability* was famously made precise by Alan Turing's theoretical *Turing machines*, so to gain a formal understanding of computability, one must first be familiar with that concept.

Given the *functional equivalence* of Turing machines and *register machines*, computability may also be understood in terms of the latter. This is not to say that the two are identical, as register machines are far simpler in design. Rather, it is to say that they are both capable of carrying out all and only the same procedures. Register machines are theoretical entities, types of *formal systems*, which can nonetheless be physically implemented, and are made up of *states* and *rules*. The states of a register machine are simply the *contents* of a finite sequence of *registers*, or *memory locations*, which are uniquely addressed: *R0, R1, R2*, etc. A register's content, represented by a natural number, indicates the number of arbitrary items within that register. Thus, states of register machines can be expressed as finite sequences of natural numbers. Moreover, because the kind of item contained within the registers does not matter, a register machine with the contents one, three, two in its first three registers, for example, is *isomorphic* to a sequence of piles containing one, three, and two pebbles, respectively. As for the rules of register machines, there is only one, which takes the form of a *program*, or *processing unit*. Such a program is simply a finite number of lines, each made up of a unique *line number* and an *instruction*. The possible stepwise instructions are *'increment'* (INC), *'decrement'* (DEB), and *'end'* (END). An increment instruction takes two numerical arguments, INC *a b*, where *a* denotes the address of the register to be incremented (add 1), and *b* gives the line number to be executed next. A decrement instruction takes three numerical arguments, DEB *a b c*, where *a* denotes the address of the register to be decremented (subtract 1), *b* gives the line number to be executed next, and *c* gives the line number to branch to and execute if there are no items left to take away from the given register. Lastly, the end instruction, which simply reads END, commands the register machine to *halt*, or terminate the program.
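The instruction set just described can be captured in a minimal Python sketch of a register machine interpreter. The representation here (programs as dictionaries of numbered instruction tuples, registers as a dictionary of natural numbers) is illustrative, not Carter's:

```python
def run(program, registers):
    """Execute a register machine program.

    program: dict mapping line numbers to instruction tuples, e.g.
             {1: ("DEB", 1, 2, 3), 2: ("INC", 2, 1), 3: ("END",)}
    registers: dict mapping register addresses to natural numbers.
    Returns the final register contents once END is reached.
    """
    line = min(program)                  # start at the lowest line number
    while True:
        instr = program[line]
        if instr[0] == "INC":            # INC a b: add 1 to Ra, go to line b
            _, a, b = instr
            registers[a] = registers.get(a, 0) + 1
            line = b
        elif instr[0] == "DEB":          # DEB a b c: subtract 1 from Ra and
            _, a, b, c = instr           # go to b, or branch to c if Ra is 0
            if registers.get(a, 0) > 0:
                registers[a] -= 1
                line = b
            else:
                line = c
        elif instr[0] == "END":          # END: halt
            return registers
```

For instance, running the two-line program `{1: ("INC", 1, 2), 2: ("END",)}` on an empty register *R1* increments it once and halts, leaving *R1* = 1.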

With these basic rules for a register machine, one may construct a program to carry out a specific *function*. For example, one may build a program which adds the contents of one register to another by implementing a simple *algorithm* for addition. Namely, after setting an initial condition, that is, placing a specified number of items in two registers, one may find their sum by implementing an *effective procedure* (irrespective of size) as follows. First, attempt to take one item away from the first register (decrement). If successful, add one to the second register (increment). Repeat this step until the first register is *empty*, at which point the sum of the two registers will be shown in the second register. Once no more items may be taken away from the first register, halt, or end the program. Carrying out this algorithm, or effective procedure, for any two given numbers *x* and *y* is what the program [ADD] does. This is equivalent to saying that [ADD] *'computes'* addition. Therefore, "for any given combination of a register machine program and an initial state, the sequence of applications of the program is a *computation*" (Carter, 76). Precisely, performing these computations is the sole purpose of register machines. Informally, computations just are operations of register machine programs. There are certain problems, however, that are not computable, such as the *halting problem*: as there exists no effective procedure for determining whether a given program will halt or run indefinitely, this is an example of something which cannot be computed. Conversely, specifying what can be computed requires a formal understanding of functions. A function *f(x_{1}, x_{2}, …, x_{n}) = y* is a mathematical correlation between some fixed number of inputs *n* and a unique output *y*. Basic arithmetical operations such as addition and subtraction are examples of such functions: precisely, functions that are computable via a register machine's algorithmic processes (program). These functions are the objects of computation. If [ADD] returns the appropriate value corresponding to the inputs, or initial conditions, *x_{1}* and *x_{2}*, then the program is said to 'compute' the function for addition, namely, *f(x_{1}, x_{2}) = x_{1} + x_{2}*. Therefore, "a function is computable iff there is a register machine program which computes it" (Carter, 78).
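The addition algorithm can also be sketched directly as a plain Python loop under the pebble analogy, with each step annotated by the register machine instruction it corresponds to (a minimal illustration; the register numbering is an assumption):

```python
def add(x, y):
    """Compute x + y the way the [ADD] register machine program does."""
    r1, r2 = x, y        # initial condition: R1 holds x pebbles, R2 holds y
    while r1 > 0:        # line 1: DEB 1 2 3 -- try to remove a pebble from R1
        r1 -= 1
        r2 += 1          # line 2: INC 2 1 -- add a pebble to R2, loop back
    return r2            # line 3: END -- halt; R2 now holds the sum x + y
```

Calling `add(2, 3)` removes two pebbles from the first pile one at a time, adding each to the second, and halts with 5 in the second register.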

Both Alan Turing and Alonzo Church provided similar accounts of computability, which together gave rise to the Church/Turing thesis. While Turing "argued that all and only those procedures which were algorithmic could be computed by Turing (register) machines", "Church argued that the informal notion of effective calculability could be understood (at least for calculations over positive integers) in terms of the formal notion of a recursive function" (Carter, 87). Both accounts married the informal notion of algorithmicity, or effectivity, with the formal definition of computable functions to form a thesis of equivalence, which can be generalized as follows: "all and only effective procedures are computable functions" (Carter, 87). In other words, a function can be calculated via an effective procedure iff it is computable by a register machine. Despite being widely accepted, the equivalence between effective procedures and computable functions is not a proven one, as there is no formal account of algorithmicity/effectivity. As mentioned previously with the halting problem, not all functions are computable. Nonetheless, only functions are computable. Of these computable functions, all are effective procedures. Yet it has not been formally proven that only computable functions are effective procedures. In spite of the seemingly unmistakable nature of the Church/Turing thesis, we may not hold it to be proven until we are given a formal definition of algorithmicity/effectivity. However, this is not to say that the thesis has not done its job of defining computability. It has, in fact, given us an answer to the question: what does it mean for something to be computable? Precisely, it means to be algorithmically solvable by a register machine. We may now confidently say that for every computable function there is at least one register machine capable of computing it. This alone is a major breakthrough for artificial intelligence.
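The unsolvability of the halting problem mentioned above rests on a classic diagonal construction, which can be sketched in Python. The names `paradox_against` and `halts` are illustrative, not Carter's; the point is that *any* candidate halting oracle can be turned into a program that refutes it:

```python
def paradox_against(halts):
    """Given a candidate oracle halts(p) -> bool, build the
    self-referential program that proves the oracle wrong."""
    def paradox():
        if halts(paradox):   # if the oracle says "paradox halts"...
            while True:      # ...loop forever, refuting it
                pass
        # if the oracle says "paradox does not halt", return at
        # once, refuting it the other way
    return paradox

# An oracle claiming that no program halts is refuted immediately:
p = paradox_against(lambda prog: False)
p()  # halts, contradicting the oracle's verdict
```

Whichever answer a proposed oracle gives about the program built from it, the program does the opposite, so no effective procedure can decide halting in general.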

Reference: Carter, Matt. *Minds and Computers: An Introduction to the Philosophy of Artificial Intelligence*. Edinburgh University Press, 2009.