Alan Turing

I have always been fascinated with mathematical/engineering minds.

Perhaps that’s because I don’t have one, and I’m ever curious how it would feel to approach a problem that way.

I recall visiting a company once, many years ago now, where the job was to etch computer chips. While I won’t try to recite the process here as I’d probably get it wrong, my non-engineering brain would assume that one would lay out a pattern of circuits on the material (in this case, a silicon wafer) and adhere them to it, or dig them into it. Needless to say, an engineer’s mind worked the problem from a different approach, coating the wafer with a photoresist and then exposing it to light through the pattern of the circuitry, so that the pattern was created where the light could not penetrate the surface. Or something like that! What surprised me most of all was the approach, being the opposite of what I might have imagined.

I also recall an early discussion with a math teacher, trying to figure out the “why” of algebra. Oh, sure, I could comprehend trying to reckon the so-called “word problem,” in which a situation was presented such as, “if you had 10 guests at a party, and each guest needed a plate for a 6-course meal, how many plates would you need altogether?” But solving an equation simply for the sake of the numbers themselves evaded my story-telling mind. What was the point? What was I trying to accomplish?
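The answer to that one, of course, is a single multiplication – 10 guests times 6 plates apiece, or 60 plates – the sort of thing a computer dispatches in one line. A purely illustrative Python version, just to show how little there is to it:

```python
# The dinner-party "word problem": one plate per guest per course.
guests = 10
courses = 6

plates = guests * courses
print(f"{guests} guests x {courses} courses = {plates} plates")   # 60 plates
```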

Computers and programming presented me with a similar problem. I learned to program back at a time when computers were well past their first incarnation, but not nearly so sophisticated as they are today. There were essentially two purposes for these early machines: solving complex mathematical problems by sheer force of speed and repetition; and performing boring and mundane tasks (changing the date on thousands of files, for example, or adding $5 to a billing that would be sent to 10,000 customers, recording its payment, and rebilling those who failed to pay) to save human beings the time and energy. Again, the second I could understand, and “talking” to the computer was along the lines of learning another language – literally – to speak to this alien creature and tell it very precisely what you wanted of it. The first kind of assignment required that you know something about the mathematical problem you were trying to solve.
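That second sort of chore, incidentally, maps onto just a handful of lines in a modern language. Here is a purely hypothetical Python sketch of the billing example – the record layout and the names are invented for illustration:

```python
# Hypothetical batch job: add a $5.00 charge to every unpaid bill,
# record which accounts have paid, and collect the ones to rebill.
bills = [
    {"customer": "A-1001", "balance": 42.50, "paid": False},
    {"customer": "A-1002", "balance": 17.25, "paid": True},
    # ... imagine 10,000 of these records ...
]

CHARGE = 5.00
rebill = []
for bill in bills:
    if not bill["paid"]:
        bill["balance"] += CHARGE          # add the charge to the unpaid account
        rebill.append(bill["customer"])    # flag it for another billing cycle

print(f"{len(rebill)} customers to rebill")
```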

Now, the idea of a computer goes back, in some ways, to marks on sticks and to the abacus. For every 5 sheep that ran past you through a gate, you marked a stick. Then, by counting the marks after all the sheep had passed by, and allowing 5 for each mark, a total could be obtained. An abacus allowed for something similar to our “1’s place, 10’s place, 100’s place” notion of math: each bead on a string represents a unit, each string represents a place, and large numbers can be represented by the number and position of the beads. (Keep in mind this is very simplified!)
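To make that concrete, here is a small, purely illustrative Python sketch of both ideas – tally marks that each stand for 5 sheep, and a number split into its 1’s, 10’s, and 100’s places the way the rows of an abacus split it (the function name is mine, invented for illustration):

```python
# Tally counting: one mark for every 5 sheep that pass the gate.
sheep_through_gate = 37
marks_on_stick = sheep_through_gate // 5   # 7 full marks
counted_total = marks_on_stick * 5         # 35 -- the tally misses the last stragglers

# Place value, abacus-style: each row holds one digit of the number.
def abacus_rows(number):
    """Split a number into its 1's, 10's, 100's ... places."""
    rows = []
    place = 1
    while number > 0:
        rows.append((place, number % 10))  # (place value, beads pushed on that row)
        number //= 10
        place *= 10
    return rows

print(abacus_rows(254))   # [(1, 4), (10, 5), (100, 2)] -> 4 ones, 5 tens, 2 hundreds
```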

Still, these operations came down to a certain amount of counting and a certain amount of simple “marking” of what had been counted. Math quickly advanced far beyond the need to keep track of numbers of things, and moved into distances, predictions, and abstractions.

Which brings me to one of the most fascinating, and certainly ground-breaking, mathematical minds of the 20th century: Alan Turing.

I had heard, of course, of the Turing Machine. This was not a physical machine at all, but an abstract, idealized one – and it is a large part of why Turing is often referred to as the creator of the general purpose computer. His idea was that a single such machine could be devised that would perform the tasks of any other computing machine – and store, on its tape, both the instructions it followed and the information it worked on.
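To give a flavor of what Turing described, here is a minimal, purely illustrative sketch of such a machine in Python – my own toy version, not Turing’s formulation: a tape of symbols, a read/write head, and a table of rules that says, for each state and symbol, what to write, which way to move, and which state to enter next.

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Run a simple one-tape Turing machine.

    `rules` maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is "L" or "R". The machine stops when it enters the state
    "halt" or reaches a (state, symbol) pair with no rule.
    """
    tape = list(tape)
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        if head < 0:                 # grow the tape to the left as needed
            tape.insert(0, blank)
            head = 0
        elif head >= len(tape):      # grow the tape to the right as needed
            tape.append(blank)
        symbol = tape[head]
        if (state, symbol) not in rules:
            break
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape)

# Example rule table: flip every bit of a binary string, then halt on the blank.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(flip_bits, "1011"))   # prints "0100_"
```

The striking part is that the rule table is itself just a piece of data, so it too can be written onto a tape – which is exactly the insight behind a “universal” machine that can imitate any other.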

Turing graduated from King’s College, Cambridge, with a degree in mathematics, and there published a proof demonstrating that some purely mathematical yes–no questions can never be answered by computation. In 1938, he was awarded a PhD from the Department of Mathematics at Princeton University. During the Second World War, Turing worked for the Government Code and Cypher School at Bletchley Park, Britain’s codebreaking center. As the leader of “Hut 8,” he devised a number of techniques for speeding the breaking of German ciphers, including improvements to the “bombe,” an electromechanical machine that could find settings for the Enigma machine. The Enigma machine was the Germans’ means of sending coded messages – and these messages were considered “unbreakable” until Turing put his machine to work.

In a post-graduate paper published in November and December of 1936, Turing had replaced mathematician Kurt Gödel’s universal arithmetic-based formal language with the formal and simple (but hypothetical) devices that became known as “Turing machines.”

Though the goal was to “break” the Enigma code, Turing’s work continued to revolve around the idea of using a machine to perform the routine work of mathematics, ultimately proposing the idea of “artificial intelligence.” One wonders how he would feel about today’s AI – if only we could find a way to tap into his insights! He did propose an experiment that became known as the “Turing test,” a means of determining whether a machine could be called intelligent. The test proposed that if a human could not tell the difference, in a conversation, between a human being and the machine being tested, then the machine was, indeed, intelligent. In fact, we use a form of Turing test – run in reverse, with the machine as the judge – in today’s CAPTCHA (“Completely Automated Public Turing test to tell Computers and Humans Apart”). Find all the pictures with a street light out of nine presented to you – some of them requiring a bit of study – and you prove yourself “human.”

To suggest that Turing was only a mathematician, or only a cryptanalyst, is to barely scratch the surface of this highly “enigmatic” genius. Born in 1912, he died in 1954 – an inquest ruled it suicide, two years after he had been convicted of “gross indecency,” the charge then used to prosecute homosexual behavior. Turing left indelible marks in logic, philosophy, and theoretical biology as well.

So today, as we use computers in virtually every aspect of day-to-day life – relying on them to perform the needed calculations to advise, entertain, track, and store information – we are, at least in part, tapping into the brilliance of Alan Turing and his notion of what a machine could (or could not) do.

Nancy Roberts