Charles Petzold



“The Imitation Game” and Alan Turing’s Real Contribution to Computing

December 11, 2014
New York, N.Y.

As Alan Turing (portrayed by Benedict Cumberbatch) races against the clock to build a machine to crack the Nazi Enigma code in the recent movie The Imitation Game, only Joan Clarke (Keira Knightley) understands the underlying quest of this tortured genius.

“You’re trying to build your Universal Machine,” she says [about 44 minutes into the movie], alluding to Turing’s 1936 paper on Computable Numbers. “You theorized a machine that could solve any problem. It didn’t just do one thing; it did everything. It wasn’t just programmable. It was reprogrammable,” to which Turing grunts an assent.

But the Turing character really shouldn’t agree, for two reasons:

First, contrary to what the Joan Clarke character says — or rather, contrary to the screenplay by Graham Moore — the machine portrayed in the movie being built at Bletchley Park (which was called a "bombe" but which the movie ludicrously names "Christopher") does not qualify as a Universal Machine. It is a special-purpose machine rather than a general-purpose computer.

Secondly, although the hypothetical Universal Machine that Turing constructed in his 1936 paper is recognized today as an abstract model of digital computing, it cannot solve every mathematical problem. Some problems are forever out of its reach, and they remain impossible even on today’s sophisticated and powerful computers.

We might excuse a Hollywood biopic such as The Imitation Game for lapses of mathematical accuracy (although the inaccuracies of the movie go far beyond the mathematical), but the same problem pops up in the 2011 British television documentary released in the U.S. as Codebreaker. Here we are told [about 17-18 minutes into the movie] that “Turing’s Universal Machine was purely hypothetical but it laid out the fundamental principle underpinning all computers—that any conceivable mathematical calculation can be done by a single device shuffling 1’s and 0’s back and forth…. And part of Alan Turing’s genius was to realize that a machine like this can compute absolutely anything, because anything can be written as 1’s and 0’s.” Not true! Computers cannot compute everything. They have built-in restrictions that are intrinsic to the mathematics of digital computation.

Everyone agrees that Alan Turing is a seminal figure in the computer revolution, so it's puzzling that these movies misrepresent his work. We should at least make an attempt to get it right, and particularly to get a handle on Turing's most significant contribution to computer science, the 1936 paper “On Computable Numbers, with an Application to the Entscheidungsproblem.”

The Entscheidungsproblem or “decision problem” was posed by German mathematician David Hilbert in the 1920s. Hilbert was working with a form of mathematical logic, and he asked: Is there a general procedure that can tell us whether any arbitrary statement in this logic is true or false?

Today we call such a general procedure an “algorithm,” a methodical mathematical recipe formulated to be independent of the person who carries it out. The mechanical nature of such processes perhaps suggested to Turing a unique approach for answering Hilbert’s question: Turing imagined the actions of a person carrying out an algorithm as a machine that performs simple rudimentary steps of writing and erasing symbols on paper based on a history of previous steps. Each algorithm is implemented in what Turing called an “automatic machine” but which today we call a Turing Machine.

By formalizing algorithms as sequences of simple uniform steps, Turing could assign each algorithm a unique number that describes these steps. Turing then created a special Turing Machine called the Universal Machine. If you input the number for any Turing Machine into the Universal Machine, it will carry out that algorithm.
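To make the idea concrete, here is a minimal sketch in Python (a toy encoding of my own devising, not the notation of the 1936 paper). The transition table stands in for a machine's description number, and the little run function plays the part of the Universal Machine: it reads any such description and carries out its steps on a tape of symbols.

    from collections import defaultdict

    def run(table, tape, state="start", max_steps=1000):
        # The "Universal Machine": interpret any machine description (table)
        # by repeatedly reading a symbol, writing a symbol, and moving.
        cells = defaultdict(lambda: "_", enumerate(tape))   # blank cells are "_"
        pos = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            write, move, state = table[(state, cells[pos])]
            cells[pos] = write
            pos += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # One particular machine, described purely as data: flip 0s and 1s, then stop.
    flipper = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }

    print(run(flipper, "0110"))   # prints 1001_

The point of the exercise is that flipper is just data. Hand run a different table and the same interpreter performs a different algorithm, which is exactly the relationship between a program and the computer that runs it.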

At the time of Turing's paper, no working digital computer had yet been built. Yet the concepts are all there: A Turing Machine is analogous to the bits and bytes of a computer application, and the Universal Machine is the computer that reads and runs these apps. Only in retrospect after actual digital computers had been built was it evident that Turing had already established a deep understanding of the mathematical foundations of computing. Turing’s work implies that all digital computers and programming languages are fundamentally equivalent in their algorithmic capabilities.

Yet the real intent of Turing’s paper on Computable Numbers was to demonstrate that these machines — and hence human beings carrying out algorithms — were inherently limited in what they could do. There is no general procedure that solves the Entscheidungsproblem, and the problem goes deeper.

In particular, one important category of Turing Machine is impossible: You cannot define a Turing Machine that can determine if other Turing Machines are working correctly.

Translated to modern terms, this has unnerving implications: Computer programs often contain flaws or errors called bugs, and the work of debugging software is a vital part of a programmer’s job. But this is a task that cannot be automated. You cannot write a generalized computer program that will find all the bugs in other computer programs. (If there were such a thing, the world wouldn’t have held its breath as the year ticked from 1999 to 2000.) The problem is that you must use this program on itself, and because you don’t know if this debugging program is itself entirely free of bugs, you can’t entirely trust it.
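The reasoning behind this impossibility can be sketched in a few lines. What follows is the standard textbook recasting of Turing's argument, usually framed as the “halting problem” rather than in the notation of the 1936 paper; the names halts and contrary are mine, and the point is the contradiction, not a working program.

    def halts(program_source, data):
        # Hypothetical oracle: reports whether the given program, run on
        # the given input, eventually finishes. Turing's argument shows
        # that no such function can exist.
        ...

    def contrary(program_source):
        # Ask the oracle about a program analyzing its own source code,
        # then do the opposite of whatever it predicts.
        if halts(program_source, program_source):
            while True:      # oracle says "finishes" -- so loop forever
                pass
        return "done"        # oracle says "loops forever" -- so finish at once

Feed contrary its own source code and the oracle is wrong no matter what it answers, so the oracle cannot exist. The hypothetical all-purpose debugger falls into the same trap: it would have to pass judgment on programs built to defy its judgment, itself included.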

Or, to rephrase the problem in the form of an age-old question: Who will debug the debugger?

So to say that Turing’s Universal Machine can “solve any problem” or “compute absolutely anything” is simply wrong, and the statement is wrong for today’s digital computers as well. For the makers of The Imitation Game and Codebreaker to ignore this fact is not just a simplification for the non-mathematically inclined. It is a fundamental misrepresentation of the capabilities and limitations of digital computers, and either a deliberate or ignorant failure to describe Alan Turing's work.

Almost 80 years ago, before a single digital computer had been built and before any software for these computers had been written, Alan Turing demonstrated that we can never be certain that such software works correctly. As we increasingly use computers and software to run this world, of course we’d like some assurance everything is working OK. But that’s not possible. A perpetual state of uncertainty is now yet another aspect of the human condition. This is Turing’s real contribution to computing, and it should not be a secret only among computer scientists.

Of course, it’s not Turing’s fault. That’s just the way the math works out: Buggy software is as inevitable as death and taxes.
