Turing's Cathedral: The Origins of the Digital Universe (Paperback)

3 out of 5 (1 rating)


"Turing's Cathedral", George Dyson's fascinating account of the early years of computing, is the story of how the PC, the iPod, the smartphone and almost every aspect of modern life came into being.

In 1945 a small group of brilliant engineers and mathematicians gathered at the Institute for Advanced Study in Princeton, determined to build a computer that would make Alan Turing's theory of a 'universal machine' a reality.

Led by the polymath émigré John von Neumann, they created the numerical framework that underpins almost all modern computing - and ensured that the world would never be the same again.

George Dyson is a historian of technology whose interests include the development (and redevelopment) of the Aleut kayak.

He is the author of "Baidarka", "Project Orion" and "Darwin Among the Machines".

"Unusual, wonderful, visionary" (Francis Spufford, "Guardian"). "Fascinating... the story Dyson tells is intensely human... a gripping account of ideas and invention" (Jenny Uglow). "Glorious... as much a story of the personalities involved as of the discoveries they made, and you do not need any knowledge of computers or mathematics to enjoy the ride... a ripping yarn" (John Gribbin, "Literary Review").


  • Format: Paperback
  • Pages: 432 pages
  • Publisher: Penguin Books Ltd
  • Publication Date:
  • Category: History of mathematics
  • ISBN: 9780141015903





Showing 1 - 1 of 1 reviews.

A fascinating and illuminating book, but also a frustrating one, because it should have been a lot better than it is.

The heart of the story is more or less on target – a collection of very interesting anecdotes and narratives about the personalities involved in building America's first computer at Princeton's Institute for Advanced Study after the Second World War. Leading the team was the quite extraordinary figure of John von Neumann, about whom I knew rather little before reading this. He comes across as by far the most brilliant mind in these pages (not excluding the presence of one A. Einstein), with a near-eidetic memory and an ability to understand new concepts instantly and make staggering leaps of reasoning to advance them further. Not a very endearing character, though – a refugee from 1930s Europe, he pushed the nuclear programme hard and argued to the end of his life that the best way to create world peace was to launch a full 'preventive' hydrogen-bomb strike against the USSR, mass civilian deaths and all.

The nuclear project was central to the invention of the computer. The first incarnations of the machine (the ENIAC and, later, the MANIAC) were developed specifically to model fission reactions, which involve some rather tricky maths. But von Neumann and other thinkers realised early on that a machine capable of doing that would also be able to fulfil Alan Turing's description of a 'universal computer': if it could do the arithmetic, it turned out, it could do practically anything else too, provided there was a way of feeding it instructions. 'It is an irony of fate,' observes Françoise Ulam, 'that much of the hi-tech world we live in today, the conquest of space, the extraordinary advances in biology and medicine, were spurred on by one man's monomania and the need to develop electronic computers to calculate whether an H-bomb could be built or not.'

What is particularly fascinating is how these studies gradually led to a blurring of the distinction between life and technology. The development of computing coincided with the discovery of the structure of DNA, which showed that life is essentially built from a digital code (Nils Barricelli described strings of DNA as 'molecule-shaped numbers'), and early programmers soon discovered that lines of code in replicating systems would display survival-of-the-fittest phenomena. This is entering a new era with the advent of cloud computing and other systems by which computing is, in effect, becoming analog and statistics-based again – search engines are a fair example. 'How can this be intelligence, since we are just throwing statistical, probabilistic horsepower at the problem, and seeing what sticks, without any underlying understanding? There's no model. And how does a brain do it? With a model? These are not models of intelligent processes. They are intelligent processes.'

All of this is very bracing and very interesting. Unfortunately the book is let down by a couple of major problems. The first is a compulsion to provide what the author presumably thinks is 'historical colour' every two minutes – so the IAS is introduced by way of an entire chapter tracing the history of local Algonquin tribes, Dutch traders, William Penn and the Quakers, and the War of Independence – all of which is at best totally irrelevant and at worst a fatal distraction.

The second, even more serious failing is that the technology involved remains extremely opaque. One of the most crucial parts of this story should be following the development of cathode-ray-tube storage, early transistors, and the first moves from machine language to codes and programs. But the explanations here are poor or non-existent. Terms like 'shift register', 'stored-program computer' and 'pulse-frequency-coded' are thrown around as though we should all be familiar with them.

My favourite story about the invention of the digital world involves Claude Shannon and his remarkable realisation – one of the most civilisation-altering ideas of our species – that electrical switching circuits could work as logic gates. It's not even mentioned in this book. And so the crucial early building blocks of what a computer actually is – how, on a fundamental level, it really does what it does – are all missing. And that's a pretty serious omission from a writer who finds it necessary to go back to the Civil War every couple of chapters.

A lot of reviews here, especially from more technical experts, really hate this book, but on balance I'd still recommend it. It's a very important story about a very important period, and the later chapters especially have a lot of great material. But you will probably need to read around the text, when this book should have offered a complete package.
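
The Shannon insight the reviewer mentions (that simple on/off switches wired together behave as logic gates, and that logic alone is enough for arithmetic) can be illustrated with a short sketch. The Python below is purely illustrative and is not drawn from the book: each switch is modelled as a boolean, series wiring acts as AND, parallel wiring as OR, and from those gates a one-bit adder follows.

```python
# Illustrative sketch only (not from the book): the observation that
# switching circuits implement Boolean logic, and that Boolean logic is
# enough to do binary arithmetic.

def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed: AND."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed: OR."""
    return a or b

def inverted(a: bool) -> bool:
    """A normally-closed switch conducts when its control is off: NOT."""
    return not a

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Add two one-bit numbers using only the 'switch' gates above."""
    # XOR built from AND, OR and NOT: (a AND NOT b) OR (NOT a AND b)
    sum_bit = parallel(series(a, inverted(b)), series(inverted(a), b))
    carry = series(a, b)  # the carry is simply a AND b
    return sum_bit, carry

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            s, c = half_adder(a, b)
            print(f"{int(a)} + {int(b)} -> sum {int(s)}, carry {int(c)}")
```

Running it prints the four one-bit addition cases: once switches give you logic, logic gives you arithmetic, and arithmetic, fed with instructions, gives you the universal machine the review describes.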