“Software is eating the world.” This emphatic statement was made in 2011 by Marc Andreessen, cofounder of the Silicon Valley venture capital firm Andreessen Horowitz. Andreessen, an ace software engineer himself, had in the mid-1990s cofounded the pioneering internet company Netscape and led the development of its breakthrough browser, Netscape Navigator, which opened the internet to the masses by introducing crucial user-friendly features that seriously lowered the technical bar for getting around the web successfully.
Andreessen made that statement because, unlike pre-internet software giants like Microsoft, Oracle and IBM, which mainly sold software to other industries, the biggest post-internet software giants, most notably Facebook, Amazon, Netflix and Google (the so-called FANG), primarily invade other industries, compete with the traditional players there, and transform those industries beyond recognition. Let’s look at each FANG member, starting with Amazon.
Amazon is a retail giant, selling virtually everything there is to be sold via the internet using innovative software. It didn’t originally build its software to sell to companies in other industries, though it now has a hugely profitable side business doing exactly that (Amazon Web Services). It only realized, after taking care of its retail business, that it had spare computing capacity that could be sold to whoever wanted computing resources delivered over the internet (i.e. cloud computing). So while Amazon is in the business of retail, it is at heart a software company, employing the same kind of tech talent that the likes of Microsoft, Oracle and IBM employ.
The same goes for Google. It is essentially an advertising firm, and it has utterly transformed that industry. It too has turned its spare capacity into a hugely profitable side business (Google Cloud Platform). Like Amazon, its core employee base is made up of software engineers.
Netflix, in turn, is essentially a video club, movie theater/production company, and TV station all rolled into one. Like Amazon and Google, it has transformed those industries using bleeding-edge software engineering.
Facebook, like Google, is essentially an advertising firm, though to most of us it feels like the world’s largest social club, which is understandable given that Mark Zuckerberg took as his inspiration (or so we are led to believe) the social clubs at Harvard University. Like the other FANG members, its main employee base is made up of hotshot software engineers.
Andreessen didn’t stop there. He mentioned Apple iTunes, Spotify and Pandora, software firms that are essentially music companies. He also pointed out that, at the time, the fastest-growing entertainment companies were videogame makers, again basically software companies. And the fastest-growing telecom company at the time was Skype, yet another software company, one eventually purchased by Microsoft for $8.5 billion. It would be eclipsed a mere three years later by another “telecom but software at heart” company, WhatsApp, purchased by Facebook for $19 billion. Andreessen even noted that software was eating much of the value chain of industries that primarily exist in the physical world, giving specific examples like automobiles, airlines, agriculture, oil and gas, and logistics, among many others.

Most importantly, Andreessen explained why all these changes were taking place. He noted that the computer revolution starting in the 1940s and ’50s, the invention of the microprocessor in the 1960s and ’70s, and the rise of the World Wide Web in the 1990s (in which, you will recall, he played a most crucial role) had all coalesced to make possible the technology required to transform industries through software on a global scale. Here in Nigeria, we have seen this most glaringly in retail (Jumia, Konga) and finance (Flutterwave, Paystack). We would do well to go back in time and unravel these revolutions.
We may have to go further back than Andreessen. To 1843, during Britain’s Victorian age. A titled young lady of about 28, Ada, Countess of Lovelace, a woman with a fascination for both mathematics and poetry, had just become a scientific collaborator of sorts to Charles Babbage, the leading mathematician of his age (he occupied the same chair at Cambridge that Isaac Newton had held when he was a Cambridge don). This had come about because some nine years earlier, in 1834, Babbage had conceived of building a large mechanical contraption he called the “Analytical Engine”, which was to be the world’s first “general purpose” computer, general purpose in the sense that it could be made to perform all manner of mathematical calculations. Babbage had previously, in 1822, started building another mechanical contraption, dubbed the “Difference Engine”, which, had he finished it, would have been the world’s first computer. The Difference Engine was designed only to tabulate logarithms, sines, cosines and tangents, and he abandoned the project once the idea of a general-purpose computer struck him.
As a result of this collaboration, Ada Lovelace would make many seminal contributions to what became the field of computer science. First, she suggested that the Analytical Engine could be made to manipulate symbols other than numbers, such as musical notes and artistic patterns. That was never achieved with the Analytical Engine, because the project was never finished, but it is something we take for granted in today’s digital computers. She also devised what many consider the first published computer program, and in the process she was the first to describe how a computer could carry out a certain task repeatedly, a concept known as looping, which has gone on to become a foundational construct in computer programming. It is for these latter two achievements that I felt compelled to include Ada Lovelace in this history of software.
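Lovelace’s published program was a procedure for computing the Bernoulli numbers, and it relied on exactly this idea of repeating a step. The following is a minimal modern sketch of that looping idea in Python, using the standard recurrence for Bernoulli numbers; it is an illustration of the concept, not a rendering of her actual table of operations.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0..B_n as exact fractions,
    using the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1.
    (A modern illustration of looping, not Lovelace's original method.)"""
    B = [Fraction(1)]                    # B_0 = 1
    for m in range(1, n + 1):            # outer loop: one pass per number
        acc = Fraction(0)
        for j in range(m):               # inner loop: the repeated step
            acc += comb(m + 1, j) * B[j]
        B.append(-acc / (m + 1))         # solve the recurrence for B_m
    return B

print(bernoulli(4))  # [1, -1/2, 1/6, 0, -1/30]
```

The point is the structure: a single small rule applied over and over, which is what Lovelace first described and what every loop in every modern program still does.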
As I said earlier, Babbage never completed any of his machines, so mankind had to wait until the 1940s and ’50s for the first working computers. These were giant machines that could fill an entire room; I discuss them in a previous article. As a significant part of this generation of computers was developed during the World War 2 era, they were primarily used to calculate ballistics. Much of the programming needed for the calculations was done by women. One of the most influential was Admiral Grace Hopper, who played a leading role in the creation of COBOL, an early programming language designed to run on computers made by different manufacturers. To make that possible she came up with another invention, the compiler, a program that translates a human-readable language like COBOL into the instructions a particular machine understands.
The 1960s and ’70s brought the invention of the microprocessor, which enabled computers small enough to fit on a desk and thus ushered in the Personal Computer (PC) revolution. Before the PC era, although programming had always gone hand in hand with the development of computer hardware, there was no software industry as such: computers were very expensive machines that only a few large organizations could afford, and they tended to be programmed by the organizations that bought them.
The advent of the PC made it possible for a software industry to come into existence, because computers were now cheap enough to be bought by many people. Nobody understood this better than Bill Gates. Born in 1955, he dropped out of Harvard in 1975 to found Microsoft and develop software for the emerging PC manufacturers. He made the deal of his life when Microsoft was contracted by IBM to build the operating system for its pioneering PC. Microsoft struck a shrewd deal that let it retain the right to sell the operating system to other manufacturers, knowing full well that clones of the IBM PC were bound to spring up. They duly did, and Gates rode this wave to become the king of the software industry and the richest man in the world for at least 13 consecutive years.

Another hugely important company in the PC era was Apple, founded by Steve Jobs and Steve Wozniak. Apple was different from Microsoft in some crucial ways. It had both a hardware and a software division, whereas Microsoft focused exclusively on software. It also chose to have its software and hardware tightly integrated, so that you couldn’t have one without the other. Although this made for a better user experience, it was nowhere near as profitable as Microsoft’s strategy. In either case, for PCs to really become popular, they needed what is referred to in the industry as a “killer application”, i.e. a piece of software so compelling that people would go out and buy PCs in droves. That killer application was the spreadsheet. The first one, called VisiCalc, was developed by an entrepreneur named Dan Bricklin for Apple’s first mass-market PC, the Apple II, and it drove the Apple II’s success. A competing spreadsheet program, Lotus 1-2-3, developed by Mitch Kapor, drove the sales of the IBM PC and its clones. It quickly supplanted VisiCalc in the market but was itself eventually supplanted by Microsoft Excel in the early 1990s.
By the early ’90s, software was a well-established industry producing some of the world’s wealthiest people, like Bill Gates and Oracle founder Larry Ellison. But another major change was afoot. As software had become a very profitable industry, it was standard practice for software firms to withhold the instructions, called source code, that made their products run, making it impossible for customers to modify the software to suit their particular needs. This didn’t sit well with some idealistic types, and two of them, an American called Richard Stallman and a programmer from Finland called Linus Torvalds, independently set out to remedy it. Their efforts would later merge (by merge I mean the distinctive works they produced, not the men themselves). Both set out to build an operating system whose source code would be freely available for modification and that would also be free in terms of price, but they approached their respective projects from opposite directions. Stallman, being 17 years older, started first. He built an entire operating system except for its most fundamental part, the kernel, the component that communicates directly with the computer hardware. Torvalds started by building a kernel, and by the time he was done with it he had discovered Stallman’s operating system on the internet and merged his kernel with it. The resulting complete operating system came to be known as Linux.
Torvalds did another unusual thing. As he was building his kernel, he regularly released the source code on the internet for other programmers to tinker with. They were soon making improvements, and wholly new contributions were added. This continued after he had merged his kernel with Stallman’s operating system, and the result was a new movement, the open-source movement, in which far-flung programmers who didn’t know each other collaborated over the internet to build software and make it freely available. From the work of Stallman and Torvalds, open source has grown to sit at the center of the multibillion-dollar software industry. Even the biggest tech companies today regularly release open-source software, sometimes alongside their paid-for offerings.
The open-source movement wasn’t the only major development of the early ’90s. At a European particle physics research center, a British programmer by the name of Tim Berners-Lee was busy crafting what would become the World Wide Web. Originally built to let researchers easily share information, it became, within a few years, the dominant platform for computing, supplanting the desktop, where software had to be installed on each individual machine. The Web was grafted onto the internet, which had existed since the 1960s but was a forbidding place for computer novices because navigating it required a high level of technical mastery. The Web made it easy for ordinary computer users to surf the internet, and some of the key developments that made it user-friendly were made by the person we started this article with…Marc Andreessen. So we have come full circle.
The software industry is still growing by leaps and bounds. The most interesting development today is of course the emergence of AI (which itself has a fairly long history). Whatever comes next, we can be sure of one thing…software will continue to eat the world.
Bibliography
- Andreessen, Marc. 2011. ‘Why Software Is Eating the World’. https://a16z.com/why-software-is-eating-the-world/
- Isaacson, Walter. 2014. The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution. London: Simon & Schuster.