World War II code breakers developed your smartphone

British mathematician and logician Alan Turing conceived of the idea of the universal computer. Here, a statue of Turing created by artist Stephen Kettle at Bletchley Park shows him puzzling over an Enigma machine.

Turing's Cathedral: The Origins of the Digital Universe
Author: George Dyson
Publisher: Pantheon (2012)
Binding: Hardcover, 432 pages

Before there was Mark Zuckerberg, before there was Steve Jobs, before there was Bill Gates, there was Alan Turing. Turing was a British mathematician and logician, and he had some ideas about computers. “The entire digital world we live in can be traced in large part to his ideas,” says George Dyson, author of "Turing’s Cathedral: The Origins of the Digital Universe."

Turing, says Dyson, “conceived of the idea of the universal computer, a computer that can solve any problem that can be solved and he did that as a mathematical proof that would seem to have no implications in the real world, and within 10 or 20 years, it had huge implications.”

Alan Turing worked as a code breaker for England during World War II. That was a bit like being a hacker before there was an Internet, trying to get past security and get the data you want. In this case, from the Axis powers.

“Suddenly the question of cryptography, of encoding and decoding digital messages, actually became one of the central things to the outcome of the war,” Dyson says, “whether you could have unbreakable codes on your side and break the enemy's side, and Turing had a huge influence in the outcome, in that our side, speaking for the British and the Americans, did break enemy codes.”

Dyson's book tells the story of the men and women who took Turing's idea of a machine that could compute any sequence and set out to actually build it. The team was led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey. How they pulled it off is a long story, far too long to summarize here.

Dyson says that work in the 1940s created our world today, where computers are everywhere, and that happened because of the speed at which data could be processed.

“They built a digital matrix,” he says, “and they structured it in a way so that if you put in 20 bits, you get 40 bits back. That's like one plus one equals four, rather than one plus one equals two, so you got out more bits than you put in, and the moment they switched that thing on, the world changed.”

That ever-increasing power led to machines with better memory running software that could do more. And it kept multiplying to the point where we now have Windows and Macintosh and Google and Facebook and on and on. Because of code.

“It doesn't cost anything to replicate code,” says Dyson. “So the companies that make code, that's why they've done so well. We take it for granted now, but why is it that code is free? It’s because somebody built this self-replicating process.”

I asked Dyson if the way we use the term “code” today could be traced straight back to World War II and the work that Alan Turing was doing. “Yes,” he says. “The two are directly related.”

Also on today’s program, giddyap little robotic doggies, it’s the Robot Roundup. We learn about robots mapping the ocean floor for Titanic debris, putting out fires on submarines, and setting new robot land speed records without the benefit of a head. Not all at the same time, of course, though that would be pretty rad.

About the author

John Moe is the host of Marketplace Tech Report, where he provides an insightful overview of the latest tech news.
Comments

Well, Charles Babbage certainly envisaged the idea of the computer with his Analytical Engine, but he never wrote it down in an abstract enough way to enable future generations to build either mechanical, electro-mechanical or electronic computers. Maurice Wilkes, a distinguished British pioneer of computing who knew Turing, wrote in 1971 to mark the centenary of Babbage's death: "[Babbage] however brilliant and original, was without influence on the modern development of computing". Indeed, Wilkes goes further, arguing that Babbage's costly and very public failure caused the British to shun anything to do with mechanical calculation for almost a century.
So it's true in one sense, but in a more direct sense we can trace the lineage of every modern computer directly to Turing. How do I know all this? I've just written a book called "The Universal Machine - from the dawn of computing to digital consciousness", which starts with Babbage and goes via Turing to the present day and into the future of computing: http://www.theuniversalmachine.com and http://universal-machine.blogspot.com

Sorry, guys, you get an F for research here. While Alan Turing is a key figure in the history of computing, he is NOT the inventor of the concept of "a universal computer".

That honor goes to Charles Babbage's Analytical Engine in the early 1800s. A recent article in the Economist titled "Babbage's Last Laugh" (http://www.economist.com/node/324654?story_id=E1_PNQGVQ) takes one through the evolution of universal computation as a concept. It also introduces the fascinating Ada Lovelace.

Self-replication? Nonsense. The reason we can duplicate code for free is that it's stored in an abstract way, as voltages. When we want it to be stored in a concrete way, like CDs, the cost becomes noticeable.

Both of these methods were around LONG before the Enigma project.

Free duplication of abstractly-stored code has always been possible through an obscure and now-forgotten method called "speech". When I use words to tell you how to perform a task, and you remember how to do it, you have duplicated the code for this task at zero cost.

Cheap duplication of concretely-stored code is less ancient but still fairly old. Think of player piano rolls.
