Charles Petzold



Computers, Randomness, and Alan Turing

December 6, 2007
Roscoe, N.Y.

Jeff Atwood's recent and earlier blog entries remind us of one numerical job that computers really stink at: generating random numbers.

I suspect that Alan Turing was one of the first people to realize this. His famous 1936 paper "On Computable Numbers, with an Application to the Entscheidungsproblem" begins by discussing how to specify the actions of computing machines to calculate real numbers. He quickly determines that the numbers computable by all such machines — which he classifies as the "computable numbers" — are merely an enumerable subset of the real numbers. This is easily provable, but also understandable when one considers that the vast majority of real numbers are simply strings of truly random digits, so how are you going to get a completely deterministic machine to compute them?
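
To make the counting argument concrete, here is a standard sketch in modern notation (not Turing's own presentation): a machine is specified by a finite table of instructions, which is just a finite string over some finite alphabet, so there can be only countably many machines and hence only countably many computable numbers, while Cantor's diagonal argument shows that the real numbers are uncountable.

    % A standard counting sketch, not taken from Turing's paper:
    % each machine has a finite description over a finite alphabet \Sigma,
    % so the machines inject into the countable set of finite strings,
    % while the reals have the strictly larger cardinality of the continuum.
    \{\text{machines}\} \;\hookrightarrow\; \Sigma^{*},
    \qquad |\Sigma^{*}| = \aleph_{0},
    \qquad |\mathbb{R}| = 2^{\aleph_{0}} > \aleph_{0}.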

For example, if you tried to compute the digits of a number using a typical pseudo-random number generator such as the rand function in C or the System.Random class in the .NET Framework, you'd eventually reach the end of the cycle and start repeating digits, at which point you're merely calculating a rational number.
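
Here's a minimal sketch in C of why that happens. It uses a deliberately tiny linear congruential generator rather than the real rand (which in many C libraries is a linear congruential generator too, just with a vastly longer period), so the repetition becomes visible within a few dozen digits:

    /* A minimal sketch, not the actual C library rand implementation:
       a deliberately tiny linear congruential generator whose state
       repeats quickly, so the "digits" it produces soon cycle. */

    #include <stdio.h>

    int main(void)
    {
        unsigned int seed = 1;

        /* Print 40 pseudo-random decimal digits.  With a modulus of
           only 16, the generator must revisit a state within 16 steps,
           so the digit stream is eventually periodic. */
        for (int i = 0; i < 40; i++)
        {
            seed = (5 * seed + 3) % 16;     /* tiny LCG: period at most 16 */
            printf("%u", seed % 10);        /* take one decimal digit      */
        }
        printf("\n");

        return 0;
    }

Any generator with only finitely many internal states must eventually revisit a state, after which its output repeats forever, and a decimal expansion that is eventually periodic is exactly the decimal expansion of a rational number.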

Skip ahead 12 years. Alan Turing is a Reader in the Department of Mathematics at Manchester University under M.H.A. ("Max") Newman, the former Cambridge mathematician whose class in the Foundations of Mathematics Turing had attended in 1935, and who guided Turing's classic paper to publication. Both Newman and Turing are heavily involved in one of Manchester University's seminal computer projects, the Ferranti Mark I, developed between 1949 and 1951. (Several years later Max Newman wrote Turing's obituary for the Royal Society; he outlived Turing by 30 years.)

One peculiar feature of the Mark I was this: the machine included a hardware instruction that delivered genuinely random numbers. John von Neumann famously warned that "any one who considers arithmetical methods of producing random digits is, of course, in a state of sin,"[1] and the Mark I committed no such sin: its random numbers came from a source of electronic noise rather than from any arithmetical process, a feature reportedly included at Turing's insistence.[2]

Wouldn't it have been interesting for Steve Wozniak or Don Estridge to have also decided that every computer needed a hardware random number generator, and for that feature to have become a standard part of the machines we use today?


[1] John von Neumann, Collected Works, Volume V: Design of Computers, Theory of Automata, and Numerical Analysis (Macmillan, 1963), 768. The statement was originally made at a symposium on Monte Carlo methods in 1949.

[2] Martin Campbell-Kelly, “Programming the Mark I: Early Programming Activity at the University of Manchester,” Annals of the History of Computing, Vol. 2, No. 2 (April 1980), 136.

