History of Computing

1. Earliest Traditions

People have always needed to keep track of things - whether the number of objects, a measure of distance, weight, or time. The fingers of the hands served as the first counting device, and from them was born our decimal number system, based on 10.

There were several systems for representing numbers. The Romans used symbols for certain numbers like so: I = 1, V = 5, X = 10, L = 50, C = 100, D = 500, and M = 1000.

The number 128 would be written as CXXVIII. Even the simplest arithmetic operation, such as addition, was difficult to perform using Roman numerals. The present positional number system was widely used in India and was later introduced to the West by Arab traders.
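The contrast between the two systems is easy to see in code. The sketch below (a modern illustration, not a period device) builds a Roman numeral by repeated subtraction of symbol values, while the positional form of the same number falls out of powers of ten:

```python
# Symbol values in descending order, including subtractive pairs (CM, XC, ...).
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n):
    out = []
    for value, symbol in ROMAN:
        # Emit the symbol as many times as it fits, then carry on
        # with the remainder.
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

# In the positional system each digit's value depends on its place:
# 128 = 1*100 + 2*10 + 8*1
print(to_roman(128))  # CXXVIII
```

Note how the Roman representation needs more and more symbols as numbers grow, whereas the positional form just adds one digit per power of ten.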

2. Mechanical Counting Devices - Abacus, Napier's Bones, Slide Rule

The abacus was one of the first adding machines. It is made of beads strung on several wires, and the position of a bead determines its value. Thus only a few beads are needed to represent large numbers. Contrast this with the Roman system of counting, where different symbols were used to represent larger and larger numbers.

John Napier (1550-1617), a Scottish mathematician, created logarithm tables to facilitate calculations. He also created a device using rods, known as Napier's bones, to perform arithmetical calculations. These rods were widely used by accountants and bookkeepers.

Several people used the concept of logarithms to develop the slide rule. In particular, mention must be made of the French artillery officer Amedee Mannheim (1831-1906), who introduced the movable double-sided cursor on the slide rule. With a modern slide rule one could not only perform the arithmetical operations but also calculate squares, square roots, logarithms, sines, cosines, and tangents. The slide rule remained in use until the mid-1970s.
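The principle behind the slide rule is the logarithm identity log(ab) = log(a) + log(b): sliding one scale against another adds lengths proportional to the logarithms, and the product is read off as the antilog of the sum. A minimal sketch of the idea (the function name is invented for illustration):

```python
import math

def slide_rule_multiply(a, b):
    # A slide rule adds lengths proportional to log(a) and log(b);
    # the product is the antilog of that sum.
    return 10 ** (math.log10(a) + math.log10(b))

print(slide_rule_multiply(12.0, 8.0))  # ~96.0 (up to floating-point error)
```

A physical slide rule gives only about three significant figures, since the precision is limited by how finely the scales can be read.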

3. Forefathers of Computing Science

For a long time, not much progress was made in the development of machines for computing. Mathematicians developed theorems, but calculations were still done by hand. Three individuals whose visions inspired computing machines are Blaise Pascal (1623-1662), Gottfried Wilhelm von Leibniz (1646-1716), and Charles Babbage (1791-1871).

Pascal invented a machine built on a system of gears. A one-tooth gear engages its single tooth with a ten-tooth gear once every revolution; it must therefore make ten revolutions to rotate the ten-tooth gear once. Numbers could be entered and cumulative sums obtained by cranking a handle. Pascal's calculator could handle the carry digit in addition but could not subtract. It was not a commercial success, because such devices could not be built with sufficient precision for practical use.

The German mathematician von Leibniz studied Pascal's designs and produced a machine that could multiply. The machine consisted of a setup mechanism to enter the digits of the multiplicand, a handle to crank for each digit of the multiplier, a result register, and a system of gears to carry out the computation. His calculator used the same shift-and-add procedure used for multiplication on present-day digital computers. The problems Leibniz faced in constructing his calculator were common to inventors of his time - poor materials and poor workmanship. In later years other mechanical calculators followed that refined the designs of Pascal and Leibniz.
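The shift-and-add procedure that Leibniz's gears carried out in decimal is the same idea a binary computer uses: examine the multiplier one digit at a time, and whenever a digit is nonzero, add a suitably shifted copy of the multiplicand to a running total. A minimal sketch in binary, as on present-day machines (a modern illustration, not Leibniz's stepped-drum mechanism):

```python
def shift_and_add(multiplicand, multiplier):
    # Multiply two non-negative integers by examining the multiplier
    # one binary digit at a time.
    result = 0
    while multiplier:
        if multiplier & 1:            # low bit is 1: add the shifted multiplicand
            result += multiplicand
        multiplicand <<= 1            # shift left = multiply by 2
        multiplier >>= 1              # move to the next binary digit
    return result

print(shift_and_add(27, 14))  # 378
```

On Leibniz's machine the "shift" was performed by physically moving the carriage one decimal place per crank of the handle.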

Charles Babbage realized that long computations consisted of operations that were regularly repeated. He decided to build a difference engine that would be fully automatic, steam driven, and print tables of numbers. This machine was designed to evaluate polynomials for the preparation of mathematical tables. Unfortunately, the metal-working technology of his time was not sufficiently advanced to manufacture the required precision gears and linkages.

When that failed, Babbage decided to build an analytical engine. The Analytical Engine was a parallel decimal computer that would operate on words of 50 decimal digits and store 1,000 such numbers. The machine had several components - components to store data and intermediate results, components for the input and output of information, and components for transferring information between them. The machine was operated through punched cards. Babbage's Analytical Engine was never built in his lifetime, but several of his concepts were used in the design of later computers.

4. Logic Machines

In the mid-nineteenth century Augustus De Morgan (1806-1871) and George Boole (1815-1864) showed that logical propositions could be treated algebraically. This led to machines that could mechanically process logical propositions using these algebraic laws. The best known of them all was the logic piano of William Jevons. An electrical version of Jevons's logic machine was built in 1947 by two Harvard undergraduates, William Burkhardt and Theodore Kalin.
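The algebraic laws in question include De Morgan's laws, which relate AND, OR, and NOT. Because a proposition over two variables has only four truth assignments, a short program can verify the laws exhaustively - exactly the kind of mechanical checking the logic machines performed:

```python
from itertools import product

# De Morgan's laws: not(p and q) == (not p) or (not q)
#                   not(p or q)  == (not p) and (not q)
for p, q in product([False, True], repeat=2):
    assert (not (p and q)) == ((not p) or (not q))
    assert (not (p or q)) == ((not p) and (not q))

print("De Morgan's laws hold for all truth values")
```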

Logic machines had no practical significance, but they reinforced the relationship between logic and computing. They also paved the way for an important theoretical paper on computing. In 1936, Alan Turing wrote a paper, On Computable Numbers, in which he described a hypothetical device - a Turing machine. The Turing machine was envisioned to perform logical operations and could read, write, or erase symbols written on the squares of an infinite paper tape. Its control unit is a finite-state machine: at each step of the computation, the machine's next action is determined by its current state and the symbol under its head, drawn from a finite table of rules. Turing's purpose was not to invent a computer but to characterize which problems are logically possible to solve. The Turing machine shares some characteristics with modern-day computers - the infinite tape can be seen as the computer's internal memory, which one can read, write, or erase. Another landmark paper by Alan Turing was Computing Machinery and Intelligence, in which he explores the question "Can machines think?"
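A Turing machine can be sketched in a few lines: a table maps (state, symbol) pairs to (symbol to write, head movement, next state). The toy machine below (invented here for illustration) flips every bit on its tape and halts at the first blank:

```python
# Rule table: (state, symbol) -> (symbol to write, head movement, next state).
# "_" marks a blank square on the tape.
RULES = {
    ("scan", "0"): ("1", +1, "scan"),
    ("scan", "1"): ("0", +1, "scan"),
    ("scan", "_"): ("_", 0, "halt"),
}

def run(tape, state="scan", pos=0):
    tape = list(tape) + ["_"]              # append a blank square
    while state != "halt":
        write, move, state = RULES[(state, tape[pos])]
        tape[pos] = write                  # write (or erase) a symbol
        pos += move                        # move along the tape
    return "".join(tape).rstrip("_")

print(run("10110"))  # 01001
```

Despite the machine's simplicity, Turing showed that this read-write-move scheme suffices to carry out any computation a modern computer can perform.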

A close correspondence between circuits and logic was first suggested in the Russian literature by Paul Ehrenfest in 1910. This was followed by work done in 1934 by V.I. Shestakov and in 1936 in Japan by Akira Nakasima and Masao Hanzawa. However, the work that received the most attention was Claude Shannon's 1938 paper, based on his master's thesis at MIT.

5. Mechanical and Electrical Calculators

In 1890 Herman Hollerith developed a device that could read census information punched into cards. Stacks of punched cards could be used as an accessible memory store of almost unlimited capacity. Hollerith started his own company to market the device; it later merged with others to become International Business Machines (IBM). Much of the data processing in the first half of the twentieth century was done using punched cards, which stored not only data but also programs.

In the early twentieth century mechanical calculators were being replaced by electrical ones. These machines used electric circuits and motors to do complex calculations. The key element in these calculators was the electromagnetic relay. The relay was basically a switch that allowed an electric current to pass when it received a signal. (Early telegraph and telephone devices used relays to transmit information.) In the mid-1930s relays were used by at least three experimenters building electro-mechanical calculators: Konrad Zuse in Berlin, George Stibitz in New York, and Howard Aiken in Cambridge, MA.

In a radical departure from other developers of calculating machines, Konrad Zuse used binary representation of numbers in the internal computation of his machine, which was designed to solve complex engineering equations. In 1941, he completed the Z3, which used 1,800 relays to store sixty-four 22-bit binary numbers, with 600 additional relays for the calculating and control units. Instructions were fed to the computer on perforated 35-mm movie film.

George Stibitz, working as a research mathematician at Bell Labs, had little knowledge of Konrad Zuse's work. In 1939 he built a Complex Number Computer that performed multiplication and division on complex numbers. The novelty of this computer was that it was accessed remotely using a teletype machine. After the United States entered the Second World War, Bell Labs became more interested in problems with immediate defense applications and built five digital relay computers for the military. The largest computer in this series was the Model V, which contained 9,000 relays and handled numbers expressed in scientific notation. The Model V was a general-purpose calculator and solved a variety of numerical problems. Program and data were fed to the computer on paper tape.

Howard Aiken and engineers at IBM developed an electro-mechanical digital computer in 1944, the Harvard Mark I. It was 51 feet long, 8 feet tall, and only 2 feet deep, with a drive shaft running along its base. The machine used relay circuits for internal computation and punched paper tape for instructions and data. It handled 23-digit decimal numbers and could perform all four arithmetic operations. The machine was used by the United States Navy for classified work. One of its programmers was a recently commissioned naval officer, Grace Hopper. It was she who found a moth trapped between two relay contacts, causing the machine to malfunction; she removed the moth and attached it to her logbook, noting that she had found the bug that was causing the problem!

There were several generations of Harvard Mark computers, but by the end of the forties engineers realized that they had reached the limits of relay-circuit technology and that a switch needed to be made to vacuum tubes.

6. Electronic Computers

Several people independently developed electronic computers. These computers used vacuum tubes instead of electromagnetic relays. Vacuum tubes can act both as amplifiers and as switches; it is the switching property that is exploited in the design of computers.

The first fully electronic computer was developed by John Atanasoff at Iowa State University with the help of his assistant, Clifford Berry. It used capacitors to store numbers; because of charge leakage, the capacitors had to be refreshed periodically - a forerunner of the dynamic memory of modern computers. The machine was designed to solve systems of linear equations, but intermittent malfunctions prevented it from being used regularly.

Mention must be made of special-purpose calculators that used vacuum tube circuitry. One was developed in Germany in 1941 by Helmut Schreyer (a friend of Konrad Zuse) to convert three-digit decimal numbers to and from binary. Another special-purpose electronic calculator, the Colossus, was developed in England at Bletchley Park. The Colossus was a joint effort by many people; it made Boolean comparisons between two strings and was used specifically to decode German messages.

At the University of Pennsylvania, John W. Mauchly and J. Presper Eckert developed ENIAC (Electronic Numerical Integrator and Computer), which used words of 10 decimal digits instead of binary digits. It had nearly 18,000 vacuum tubes, punched-card input and output, and was "programmable" by rewiring the machine.

7. Programmable Electronic Computers

John von Neumann showed that a computer could have a simple, fixed structure and yet execute any kind of computation, given properly programmed control, without the need for hardware modification. A special type of machine instruction called the conditional control transfer permitted the program sequence to be interrupted and resumed at any point. Program instructions were stored with the data in the same memory unit, so instructions could be modified in the same way as data.

As a result of these techniques computing and programming became faster, more flexible, and more efficient. Instructions in subroutines did most of the work, and frequently used subroutines were kept in libraries. The first generation of modern programmed electronic computers to take advantage of these improvements appeared in the late 1940s.

These computers used random-access memory. Input and output were done through punched cards, and programming was done in machine language. This group of machines included EDVAC and UNIVAC, the latter being the first commercially available computer. EDVAC (Electronic Discrete Variable Automatic Computer) was a vast improvement over ENIAC: the program was stored inside the computer, more internal memory was provided through mercury delay lines, and binary numbers were used rather than decimal, simplifying the construction of the arithmetic unit.

8. Transistors in Computers

Two devices invented in the late 1940s and 1950s would revolutionize the field of computer engineering. The first was the transistor, invented in 1947 by John Bardeen, Walter Brattain, and William Shockley of Bell Labs. Previously, vacuum tubes were used in computers. Vacuum tubes had a heated filament as the source of electrons; they generated a lot of heat and burnt out quite frequently. The ENIAC alone had around 18,000 vacuum tubes.

But transistors had their problems too. Like other electronic components they needed to be soldered, and the more complex the circuit became, the more complicated the connections between individual transistors and the greater the likelihood of faulty wiring. In 1958, Jack St. Clair Kilby of Texas Instruments built the first integrated circuit, or chip. A chip is a collection of tiny transistors that are connected together when the chip is manufactured, which obviated the need to solder large numbers of transistors. This not only saved space but also increased the speed of the machine.

9. Commercial computers

In 1971, Intel released the first microprocessor, a special integrated circuit that could process four bits of data at a time. A company called Micro Instrumentation and Telemetry Systems (MITS) started marketing a kit called the Altair 8800 for $397. Consumers could assemble the machine and program it by manually flipping switches on the Altair's front panel. There was no software; users had to write their own.

BASIC (Beginner's All-Purpose Symbolic Instruction Code) had been developed in 1964 by John Kemeny and Thomas Kurtz, two mathematicians at Dartmouth. Two programmers decided to write a BASIC interpreter for the Altair and contacted Ed Roberts, the owner of MITS, who agreed to pay for it. The two programmers were William Gates and Paul Allen. They later went on to form Microsoft and produce BASIC interpreters and operating systems for various machines.

BASIC was not the only computer language around. There was FORTRAN, developed in the 1950s by IBM programmers; FORTRAN was a high-level language that made scientific computation easy. Another language developed at this time was ALGOL (Algorithmic Language), intended to be a universal, machine-independent language, though it was not very successful. Descendants of the ALGOL family (by way of BCPL and B) include C, which became the language of choice for systems programmers.

In the 1960s COBOL (Common Business-Oriented Language) was developed to produce applications for the business world. Around 1970, Niklaus Wirth, a Swiss computer scientist, released Pascal as a teaching language for beginning computer science students. It forced programmers to take a structured approach to programming. Wirth later followed Pascal with Modula-2, which was similar to Pascal in structure and syntax.

10. PC Explosion

There was an explosion of personal computers after the introduction of the Altair. Steve Jobs and Steve Wozniak introduced the Apple II, which offered built-in BASIC, color graphics, and a 4,100-character memory for only $1,298.

Tandy Radio Shack put the TRS-80 on the market in 1977 and later came out with the TRS-80 Model II, which had a 64,000-character memory and a disk drive for storing programs and data. Personal computer applications took off, as the floppy disk was one of the most convenient publishing media for distributing software.

IBM, which had been producing mainframes and minicomputers, came out with the IBM PC, a small computer for the home market. It was modular, built with many parts manufactured outside of IBM.

In 1984, Apple released the first-generation Macintosh, a computer that came with a graphical user interface and a mouse. It was easy to use and became a favorite with home users. IBM released the 286 AT, which, with applications like Lotus 1-2-3, a spreadsheet program, and Microsoft Word, quickly captured the small-business market. For an account of the holy war between Microsoft and Apple, and an enlightening social commentary, read Neal Stephenson's essay In the Beginning was the Command Line.