Nazis, Human Calculators, and the Future that Failed: The Hidden History of Early Computing

Steven Schkolne
Feb 19, 2019


Konrad Zuse poses in front of a replica of one of his early machines.

THE COMPUTER
From Pascal to von Neumann
By H. H. Goldstine
Princeton University Press, 1972

I wrote my first program when I was 8, went on to earn multiple degrees with the word “computer” in them, teach at top universities, and work professionally for over a decade in high tech. You would expect my grasp of computing history to be solid.

H. H. Goldstine’s “The Computer: From Pascal to von Neumann” peeled back my eyelids. I had accepted a tidy narrative hook, line, and sinker. Computing history is not what I imagined it to be.

The story we repeat — the Babbage history of computing — is woefully inadequate. We learn that in the 19th century, Babbage dreamed up a digital computer that performed arithmetic feats with a myriad of switchlike devices. He designed the machine with painstaking attention to detail. Ada Lovelace stepped in to show how it could compute Bernoulli numbers. Sadly, Babbage died before his machine could be built. Babbage’s ideas died along with him. We’re told that, about a century later, the ENIAC picked up this near-forgotten thread of development to revolutionize the world.

Goldstine’s generational perspective and first-hand experience come together to present a deeper and far more intriguing history. Goldstine went to college in the 1930s and worked hand-in-hand with von Neumann to develop the ENIAC in the 1940s. The digital computer was no inevitability, but rather the survivor of a fierce competition among very different technologies.

In lieu of a traditional book review, I present to you the insights I found most provocative.

A collection of human “computers”, ca 1949

1. The First Computers Were People

Before computers as we know them were widespread, there were people called “computers”, whose job it was to perform computations.

The term actually dates from the 18th century: it was originally applied in an astronomical context to those who performed the lengthy calculations required to predict comets and the like. By the mid-20th century, being a computer was a well-established occupation, and beyond astronomy human computers were essential to advances in physics, engineering, and warfare.

Armed with slide rules, electrical and mechanical calculators, and pencil and paper, humans computed away. It should come as no surprise that “computer” was the first job that digital computers rendered obsolete. Of course, many of these human computers went on to become programmers and drive the early machines to even greater heights of computation.

Analog computers, such as this differential analyzer, were the best way to compute ballistics tables prior to the development of digital computers.

2. Ballistic Computation Was the First Killer App

Consider the mortar, which evolved during World War I, and similar weapons that fire a projectile at a distant target. How did soldiers in the field aim such a device? A firing table guided them: using the distance to the target, the difference in elevation, and other factors, soldiers looked up the correct angle at which to aim.

Goldstine worked for the U.S. Army, and he got involved in computing history because he was tasked with finding a way to speed up the calculation of these firing tables. The problem was significant: the shortage of computing capacity was a serious pain point.

The average table contained about 3000 entries. Each entry required about 750 multiplications, not to mention some additions and subtractions, all with a precision of 5 parts in 10,000. In practice, it took about 12 hours for a human to determine a trajectory using the kind of desk calculator that was available in 1940. That’s 375 days to fill out a single table. And that’s just for one type of gun, with a certain type of shell, and a certain type of explosive.
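A quick back-of-the-envelope check, using only the figures quoted above (a Python sketch of my own, not anything from Goldstine’s book), shows the scale of the problem:

```python
# Back-of-the-envelope arithmetic using only the figures quoted above.
entries_per_table = 3_000
multiplications_per_entry = 750
hours_per_entry_by_hand = 12          # desk-calculator estimate, ca. 1940

total_multiplications = entries_per_table * multiplications_per_entry
total_person_hours = entries_per_table * hours_per_entry_by_hand

print(f"{total_multiplications:,} multiplications per table")   # 2,250,000
print(f"{total_person_hours:,} person-hours per table")         # 36,000
# 36,000 hours is about 1,500 days for a single person working around the
# clock; the 375-day figure above works out to roughly four human computers
# grinding away in parallel, day and night.
```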

By 1940 an analog computer called a differential analyzer had reduced the calculation of a table to a mere 31 days of round-the-clock work. The process was far from fully automated, and the military needed a full set of tables for the multitude of shells and explosives used for each weapon. The potential to diminish this burden drove early wartime investment in computing.

It wasn’t until 1947 that the Harvard Mark II bested the older analog computing technology by achieving a multiplication rate of two operations per second. By that time, digital computers had been used for an unexpected military application, also related to explosives: determining the feasibility of the world’s first hydrogen bomb.

An early prototype of the world’s first digital computer, as seen in Konrad Zuse’s parents’ living room

3. The First Programmable Digital Computer Was Built in Nazi Germany

The stateside, late-wartime developments of the ENIAC and Mark I are hailed as the moment when Babbage’s dreams were finally realized. But the first substantial computer was built well before either of these landmarks. Between 1936 and 1938, a German named Konrad Zuse built a mechanical device, the Z1, which could perform arithmetic.

The Z1 and its successor, the Z2, never worked reliably, but by 1941 Zuse had completed the Z3. This machine was reliable and Turing complete, satisfying the official academic criteria for computation. While the German government helped to fund the Z3, it denied funding for a successor, deeming the work strategically unimportant. Zuse, unlike the institutional innovators in Britain and the United States, operated as a lone hacker. He built his early machines not in an academic or government laboratory, but in his parents’ living room.

Has Zuse been slighted by an unwillingness to attribute this early landmark to the Nazis? This proposition is certainly provocative, perhaps more provocative than true. The ENIAC was early computing’s historical moment: the technology worked to solve real-world problems, demand outstripped supply, and institutions around the planet jumped into the race. Zuse’s device, like other early machines (the Colossus and the Atanasoff–Berry Computer), didn’t find traction in real-world applications. There is a tendency to remember the technology that captures the market: the iPhone is heralded, for example, while the myriad smartphones that preceded it are diminished in our collective imagination.

On the other hand, nationalist bias is no stranger to history. In the USA, we remember Neil Armstrong, the first moonwalker, rather than Yuri Gagarin, the first human in space [1]. Does the fact that Gagarin was sent up by our Soviet rivals have anything to do with this? Other early computers are less deserving of attention: the Atanasoff–Berry Computer, built in Iowa, never worked, and the Colossus, built in Britain, was kept secret until the 1970s. Zuse’s machine was the definitive achievement. While I find these patterns stimulating, I don’t favor a particular conclusion. I’ll leave it to those with more appetite for social history to hash this one out.

Analog computers are physical manifestations of mathematical systems

4. Why Analog Computers Are Called “Analog”

In Computer Science and related fields, we think of digital computers as machines that perform discrete math. Analog computers deal with continuous quantities (such as the functions studied in high school calculus). This understanding is embodied by definitions like this one:

analog: of, relating to, or being a mechanism or device in which information is represented by continuously variable physical quantities

This definition is rather new. Originally, the term ‘analog’ described devices that bring together physical objects analogous to the mathematical objects in a formula in order to perform a computation. Since these devices are really good at continuous math, over time we have come to think of analog machines as “continuous math” machines, mapping the great rift in mathematics between continuous and discrete onto the great difference between these two methods of computation.

To truly understand analog computing, we must set aside for a moment its suitability for one form of math over another. We must instead explore what it means to be “analogous”, as opposed to the abstraction seen in digital machines.

The fascinating Antikythera mechanism indicates that the ancient Greeks can add computation to their list of inventions, alongside philosophy, democracy, geometry, and evidence-based medicine. Upon the turn of a crank, this ancient device physically displayed the motions of the sun, moon, and planets with startling accuracy, predicting eclipses down to the time of day and even the color of the eclipse.

The Antikythera mechanism uses dozens of precisely carved gears to create this simulation. The gears are a physical embodiment of the mathematical objects these ancients used to understand eclipse timing: ratios. Turn the crank, and the wheels move at the relative rates fixed by these ratios, accurately predicting the alignment of celestial bodies.

How does the math of a ratio become a physical object? Let us consider an example. The moon completes a circuit against the background stars roughly every 27 days, a cycle which doesn’t line up with the year precisely. Ancient astronomers knew that if they observed the heavens for 19 solar years, they would see, at the exact end of that 19th year, the moon aligned with the stars nearly precisely as it had been before, having completed 254 revolutions. This gear ratio, 254/19, is used by components of the Antikythera mechanism to simulate these orbits. Turn one wheel 19 times, and another turns 254 times: a physical analog of the underlying math.
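Here is a minimal sketch of that correspondence in code. The 254:19 ratio comes from the text above; modeling it as a single pair of wheels is a simplification, since the real mechanism spreads the ratio across a train of gears.

```python
from fractions import Fraction

# The Metonic relation as a gear ratio: 254 revolutions of the moon wheel
# for every 19 revolutions of the year wheel. (The real mechanism achieves
# this ratio through a train of gears rather than a single meshing pair.)
MOON_TURNS_PER_YEAR_TURN = Fraction(254, 19)

def moon_wheel_turns(year_wheel_turns):
    """How far the moon wheel has turned, driven by the year wheel."""
    return year_wheel_turns * MOON_TURNS_PER_YEAR_TURN

print(moon_wheel_turns(1))    # 254/19, about 13.4 lunar revolutions per year
print(moon_wheel_turns(19))   # 254: the cycle closes exactly after 19 years
```

Turning the crank of the physical device performs exactly this multiplication, only in bronze rather than in arithmetic.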

Analog computers of the 20th century took advantage of newly discovered electrical properties to create physical analogs of more complex math. In particular, these devices were surprisingly good at performing the derivative and integral operations at the heart of calculus. The current through a capacitor at any moment in time is proportional to the derivative (rate of change) of the voltage across it. The current through an inductor is proportional to the integral (accumulated sum) of the voltage applied to it over time. With a deep understanding of math and much tinkering, circuits can be created that solve differential equations, a heavyweight task for a digital computer or for many a human mathematician, essentially instantaneously. In Goldstine’s words:

This is why these machines are called analog. The designer of an analog device decides what operations he wishes to perform and then seeks a physical apparatus whose laws of operation are analogous to those he wishes to carry out. He next builds the apparatus and solves his problem by measuring the physical, and hence continuous, quantities involved in the apparatus. [p. 40]
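To make the analogy concrete, here is a small sketch that digitally simulates what one such analog element does physically. Flipping the capacitor relation around, the voltage on a capacitor is the running integral of the current fed into it, v(t) = (1/C) ∫ i dt, so a capacitor “computes” an integral simply by obeying its own physics. The component value and the drive current below are illustrative, not drawn from any real machine.

```python
import math

# Digitally simulating what a capacitor does physically: its voltage is the
# running integral of the current flowing into it, v(t) = (1/C) * ∫ i dt.
C = 1.0          # capacitance in farads (illustrative value)
dt = 1e-5        # time step in seconds
v = 0.0          # the capacitor starts uncharged

# Feed in a current i(t) = cos(t) from t = 0 to t = pi/2 and let the
# capacitor accumulate charge, one tiny Euler step at a time.
for step in range(int((math.pi / 2) / dt)):
    t = step * dt
    i = math.cos(t)
    v += (i / C) * dt          # dv = (i / C) dt

# The exact integral of cos(t) from 0 to pi/2 is sin(pi/2) = 1.
print(v)   # ~1.0: the simulated capacitor has "computed" the integral
```

The irony, of course, is that the digital sketch needs a loop of thousands of discrete steps to imitate what the physical element does continuously, all at once.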

The limitations of these machines are due to their analogous nature. As physical models of mathematics, they can’t be repurposed easily. If you want your Antikythera to simulate a different planetary system, you can’t just tweak it a bit; you have to rebuild it entirely. There is no abstraction of function from mechanism as seen in Turing-complete digital computing.

Abstractness is a key differentiator of the digital machine, enabling a wide variety of programs to be built for a vast range of purposes. This is a big question for a technology like quantum computing. Surely there are interesting, powerful computations that can be performed, most likely at a rate that far surpasses digital computing. But can these benefits be applied to all problems, or only to a small subset of problems whose nuances are analogous to the underlying quantum-physical device?

An early analog computer, ca 1949

5. Analog Computers Were Once the Future

History has largely forgotten an entire epoch of computing, the analog era. Growing up in the digital age, I was under the impression that the work done before the ENIAC was inconsequential. Goldstine lived in a different time, and saw a very different version of events. From his perspective, coming of age between the world wars, analog computing was an established technology. Digital computing was a risky, disruptive possibility that was far from certain to outperform analog solutions.

For example, consider Lord Kelvin’s tide-predicting machine of 1873. The device used wheels and pegs to simulate the appropriate trigonometric equations, and the resulting machine could predict the tides. In Goldstine’s words:

Here we see for the first time an example of a device which can speed up human process by a very large factor, as Kelvin asserts. This is why Kelvin’s tidal harmonic analyzer was important and Babbage’s difference engine was not. [pp. 47–48]
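As a rough sketch of the harmonic method the machine carried out mechanically: the predicted tide at a port is a sum of sinusoidal “constituents”, each with its own amplitude, speed, and phase. The constituent speeds below are the standard astronomical values; the amplitudes and phases are invented purely for illustration.

```python
import math

# Harmonic tide prediction: height above mean sea level is a sum of
# sinusoidal constituents. The speeds (degrees per hour) are the standard
# values for the M2, S2, and K1 constituents; the amplitudes and phases
# are invented purely for illustration.
constituents = [
    # (amplitude in metres, speed in degrees/hour, phase in degrees)
    (1.20, 28.984, 40.0),   # M2: principal lunar semidiurnal
    (0.45, 30.000, 75.0),   # S2: principal solar semidiurnal
    (0.20, 15.041, 10.0),   # K1: lunisolar diurnal
]

def tide_height(t_hours):
    """Predicted height (metres above mean sea level) t_hours after the epoch."""
    return sum(amp * math.cos(math.radians(speed * t_hours - phase))
               for amp, speed, phase in constituents)

for hour in range(0, 25, 6):
    print(f"hour {hour:2d}: {tide_height(hour):+.2f} m")
```

Kelvin’s machine evaluated a sum like this continuously as its wheels turned, tracing out the tide curve rather than computing it point by point.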

The practicality of these analog machines soon led to a healthy market, with innovators such as Ford and Newell pioneering techniques for navigation and weaponry. The aforementioned differential analyzer was the fastest way to calculate ballistics tables before the advent of digital computing.

The great success of analog computing explains the absence of MIT from the early days of digital innovation. The institute was busy leading the drive to build the alternative, analog future. Vannevar Bush and Samuel Caldwell, working at MIT in the late 1920s, invented the differential analyzer, a key device for speeding (you guessed it) ballistic computation. The original 1931 incarnation computed with physical moving parts, not electronics; it wasn’t until the early 1940s that circuits were developed to replace many of the original physical analogs. Bush and Caldwell also went on, in the early 1940s, to address the biggest weakness of these analog machines: the lack of programmability. Digital computing was the technology few believed in until the wonderful successes of the mid-1940s, after which many great digital innovations came from MIT, a long list which includes the development of magnetic core memory, TCP/IP, and email, not to mention early videogames and virtual reality displays.

Analog computers have not disappeared; there continue to be proponents who argue for a bright future for the technology.

Babbage’s machine, ca 1853, shows decimal numbers on its dials.

6. Early Computers Didn’t Have 1s and 0s

Today we take it for granted that “1s and 0s”, a.k.a. binary representation, is the most efficient way to build a machine. While Babbage considered binary, his designs all used decimal representation, built on the digits 0 through 9 that we familiarly use to count and measure. I was surprised to learn that decimal representations persisted for quite some time. In Goldstine’s words:

It used to be argued in the mid-1940s that the reason for adopting the decimal system for computers was that the problem of conversion from binary to decimal and vice versa was a major one. [p. 260]

It wasn’t until the design of the EDVAC that computing moved definitively toward binary representation. Goldstine and von Neumann showed a fast technique for converting decimal numbers to binary, and demonstrated that the overall system was simpler if these conversions merely bookended the computing process. We haven’t looked back since.
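The principle is easy to sketch. This is only the general idea, not Goldstine and von Neumann’s specific conversion technique: decimal is parsed into the machine’s binary form on the way in, all the arithmetic happens in binary, and decimal reappears only on the way out.

```python
# Conversions bookend the computation: decimal in, binary throughout,
# decimal out. (Python integers are already stored in binary internally,
# so they stand in for the machine's native representation here.)

def decimal_to_binary(s):
    """Parse a decimal string into the machine's internal (binary) integer."""
    value = 0
    for ch in s:
        value = value * 10 + (ord(ch) - ord("0"))
    return value

def binary_to_decimal(n):
    """Render an internal (binary) integer back out as a decimal string."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        n, remainder = divmod(n, 10)
        digits.append(chr(ord("0") + remainder))
    return "".join(reversed(digits))

# Decimal only at the edges; everything in between is binary arithmetic.
a = decimal_to_binary("3141")
b = decimal_to_binary("59")
print(binary_to_decimal(a * b + a))   # prints 188460
```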

If you have more curious insights about the early days of computing, please share in the comments below.

[1] Hardcore nerds who celebrate Yuri’s Night excepted, of course.

Written by Steven Schkolne

South African/American Caltech CS PhD, turned international artist, turned questioner of everything we assume to be true about technology. Also 7 feet tall.
