That we are living in the digital age is something virtually everyone knows, but very few can say precisely why. Pressed for an answer, most people would point to computers, the internet and so on. It would probably surprise them to learn that computers existed before the digital age began. Machines like the Electronic Numerical Integrator and Computer (ENIAC) and others built in the 1940s and 1950s were computers, but they belonged to an earlier era of computing.
Many machines of that era were analog computers: analog because they used continuously varying physical quantities such as voltage, electric current or pressure to represent and process data. Digital computers, by contrast, represent data as binary numbers (that is, as strings consisting entirely of ones and zeroes) and process it using the operations of binary arithmetic. The ENIAC itself was, strictly speaking, an early electronic digital machine, but it had far more in common with those analog contemporaries than with the computers of today.
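To make the distinction concrete, here is a minimal Python sketch (purely illustrative, not a description of any particular machine) of what it means to represent numbers as ones and zeroes and to add them using only binary operations:

```python
# Every value in a digital computer is stored as a pattern of bits (ones and zeroes).
def to_bits(n, width=8):
    """Binary representation of a non-negative integer, e.g. 13 -> '00001101'."""
    return format(n, f"0{width}b")

def add_binary(a, b):
    """Add two non-negative integers using only the bit operations AND, XOR and shift,
    which is essentially how a computer's adder circuitry works."""
    while b != 0:
        carry = (a & b) << 1   # positions where both bits are 1 generate a carry
        a = a ^ b              # XOR adds each bit position, ignoring carries
        b = carry
    return a

print(to_bits(13), "+", to_bits(27), "=", to_bits(add_binary(13, 27)))
# 00001101 + 00011011 = 00101000   (i.e. 13 + 27 = 40)
```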
What’s more, those early computers looked nothing like the digital devices of today. They were big, hulking machines, often filling an entire room, while many of today’s digital computers are small enough to fit on your lap (your laptop) or even in your pocket (your smartphone, iPod or pocket calculator).
The story of the digital age is the story of how computers went from being ugly giant monsters filling an entire room to sleek devices small enough to fit on your desk, then your lap, and then in your pocket. (And if the right breakthroughs are made in nanotechnology, computers might one day even be implantable or injectable.) That miniaturization was made possible by a handful of key inventions: in chronological order, the transistor, the Integrated Circuit (IC), otherwise known as the microchip, and the microprocessor.
The primary reason those early computers were so big was a key component known as the vacuum tube. A vacuum tube is an electronic device that controls the flow of electric current through a sealed vacuum. Vacuum tubes were physically large, and a great many of them were required.
The ENIAC, for instance, required about 18,000 vacuum tubes. Moreover, vacuum tubes generated a great deal of heat, so extensive cooling systems were needed, further increasing the machines’ size. Needless to say, these early computers were very expensive, and there were only a few of them because only very large organizations could afford them.
For computers (and us) to enter the digital age, a fundamental technical breakthrough was needed: the vacuum tube had to be replaced with something much smaller but just as effective. That breakthrough was the invention of the transistor. (Much of the theoretical understanding of just how far the new device could be shrunk would later be supplied by Carver Mead, a professor at the California Institute of Technology (Caltech), whose ideas helped drive the relentless miniaturization that followed.)
The transistor was invented in 1947 at Bell Labs, the research arm of the then telecom monopoly American Telephone and Telegraph (AT&T). It was the work of a trio of scientists: William Shockley, Walter Brattain and John Bardeen (strictly speaking, the first working device was built by Brattain and Bardeen, but Shockley was their boss and soon came up with an elegant improvement on the original design, the junction transistor).
The transistor was a tiny device that could amplify an electric current and switch it on or off. That switching is what enables a computer to compute: the “on” and “off” states represent the ones and zeroes described earlier. In other words, transistors carry out the computer’s binary arithmetic operations.
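As a rough illustration (a toy Python model, not how any real chip is wired), the sketch below treats each transistor as a simple on/off switch, combines a few such switches into logic gates, and uses the gates to build a “half adder”, the basic circuit that adds two one-bit numbers:

```python
# Toy model: treat a transistor as an on/off switch. A NAND gate can be built from a
# few such switches, and every other logic gate can be built out of NAND gates.
def nand(a, b):
    """Output 1 unless both inputs are 1 -- the 'universal' building-block gate."""
    return 0 if (a and b) else 1

def and_gate(a, b):
    return nand(nand(a, b), nand(a, b))

def xor_gate(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a, b):
    """Add two one-bit numbers, returning (sum_bit, carry_bit)."""
    return xor_gate(a, b), and_gate(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
# 1 + 1 gives sum=0, carry=1: the binary number 10, which is two.
```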
The more transistors a computer has, the faster it can carry out binary arithmetic. So the primary engineering task of the digital age has been to keep shrinking transistors and cram ever more of them into computers, giving us machines that are smaller and yet more powerful than their predecessors.
After the invention of the transistor, much work went into miniaturizing it further over the following decade. By 1958 it had become feasible to etch several transistors, together with the other components of a circuit, onto a single tiny chip of semiconductor material, and so it happened. This was the next major innovation of the digital age after the transistor, and it was dubbed the Integrated Circuit, also known as the “microchip”. As manufacturing improved, ever more transistors could be packed onto a single chip (eventually millions, and today billions), enabling computers to become smaller still.
The microchip was the work of two men working independently, each unaware of the other’s efforts, who arrived at their first microchips within roughly six months of each other. The first was Jack Kilby, who worked at Texas Instruments; the second was Robert Noyce, who worked at Fairchild Semiconductor, one of the seminal companies of Silicon Valley and of the digital age in general. Kilby got there first, but Noyce’s design was far more elegant and became the industry standard for microchips.
By 1968, Noyce had begun to tire of Fairchild and was seeking an exit. Backed by venture capital, he and his colleague Gordon Moore founded a company called Integrated Electronics Corporation, a name soon shortened to Intel, with Andy Grove joining as their first hire.
Up to that point, each microchip was built to carry out one specific computing application. In 1969, an early Intel employee named Ted Hoff conceived an idea that would radically reinvent the industry yet again. While working on a hand-held calculator project that called for 12 custom microchips, each for a specific function, it occurred to him that he could instead build one general-purpose chip that could be programmed to carry out all the calculator’s various functions. This was essentially a “computer on a chip”, and so it was named the microprocessor.
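To give a feel for Hoff’s idea (a deliberately simplified Python sketch, not how any real microprocessor is implemented), the difference between a fixed-function chip and a microprocessor is the difference between hard-wiring one calculation and writing a small program for a general-purpose machine to execute:

```python
# A fixed-function chip is wired to do one job; a microprocessor is a general-purpose
# machine that runs whatever program it is given. The same tiny "processor" loop below
# executes two different programs, much as Hoff's single chip replaced many custom ones.
def run(program, inputs):
    """Execute a list of (operation, destination, source) instructions over registers."""
    regs = dict(inputs)                      # e.g. {"x": 7, "y": 5}
    for op, dest, src in program:
        if op == "COPY":
            regs[dest] = regs[src]
        elif op == "ADD":
            regs[dest] = regs[dest] + regs[src]
        elif op == "MUL":
            regs[dest] = regs[dest] * regs[src]
        else:
            raise ValueError(f"unknown instruction: {op}")
    return regs["out"]

adder   = [("COPY", "out", "x"), ("ADD", "out", "y")]   # out = x + y
squarer = [("COPY", "out", "x"), ("MUL", "out", "x")]   # out = x * x

print(run(adder,   {"x": 7, "y": 5}))   # 12
print(run(squarer, {"x": 7, "y": 5}))   # 49
```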
The microprocessor soon made its way into a wide variety of electronic devices and, most importantly, it again made computers smaller, turning them into the personal machines that now sit on our desks and laps. Intel would dominate the microprocessor industry for decades.
Fast-forward to today and, yet again, major changes are afoot. With AI all the rage, a huge demand has emerged for specialized chips that can handle AI workloads, and Intel has slipped behind competitors like Nvidia in this space. Nvidia recently achieved a trillion-dollar stock market valuation on the strength of its Graphics Processing Units (GPUs).
These are specialized chips originally designed for rendering graphics in video games and animated films; it has since been discovered, somewhat by accident, that they are also extremely well suited to AI workloads. Google, for its part, has built chips aimed squarely at AI, known as Tensor Processing Units (TPUs). One thing we can be sure of is that microprocessors will continue to evolve to meet the needs of exciting new markets.
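To get a rough sense of why such chips matter (a generic NumPy sketch, not tied to any particular GPU or TPU), note that most AI workloads boil down to enormous numbers of multiply-and-add operations on matrices, exactly the kind of uniform, highly parallel arithmetic these processors are built to churn through:

```python
import numpy as np

# One layer of a neural network is essentially one big matrix multiplication plus a bias.
# GPUs and TPUs are fast at AI because they perform these multiply-adds in parallel.
rng = np.random.default_rng(0)
inputs  = rng.standard_normal((64, 1024))     # a batch of 64 inputs, 1024 features each
weights = rng.standard_normal((1024, 4096))   # one layer's learned parameters
bias    = rng.standard_normal(4096)

activations = np.maximum(inputs @ weights + bias, 0.0)   # matrix multiply, then ReLU

# Roughly 64 * 1024 * 4096 ≈ 268 million multiply-adds for this single layer alone.
print(activations.shape)   # (64, 4096)
```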
Bibliography
- Gilder, George. 1990. Microcosm: A Prescient Look Inside the Expanding Universe of Economic, Social, and Technological Possibilities Within the World of the Silicon Chip. New York: Touchstone.
- Isaacson, Walter. 2015. The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution. London: Simon and Schuster.

