by Roberto Verzola
The micro/mainframe terminology derives from an early debate in the computing industry between IBM Corp. and Intel Corp. IBM was the pioneer and leader of the computer industry in the 1960s, facing only a handful of smaller competitors. The industry players were often described as “Snow White and the seven dwarfs”.
IBM’s core product was the mainframe computer, a large computer system that occupied a whole room or a big hall. The system generated so much heat that the room had to be cooled with powerful air-conditioners. The wiring for the different system components was so complicated that a raised floor had to be built a foot or more above the original floor of the computer room, so that all the cables could be hidden underneath. The system required a staff of hardware engineers, software engineers, programmers, data-entry personnel, and other support workers. Progress in the computer industry meant larger and more powerful mainframes. To let more people use the computer, it was connected to communication lines so it could be accessed from remote sites using computer terminals. Using computer services was expensive. It was thought that making the system larger and more powerful made it more efficient and therefore brought down the cost of using it, which was metered in computer-seconds on top of a monthly subscription fee. The goal was economies of scale in size: the bigger the computer, the cheaper the computer-time.
Intel challenged this mindset. It introduced the tiny microprocessor, putting an entire central processing unit (CPU), the heart of the computer, on a single integrated circuit, commonly called a chip. However, the power of Intel’s microprocessor was puny compared to IBM’s mainframe. IBM’s engineers laughed at Intel’s product and derisively called it a “toy”. “Real men use mainframes,” they said.
However, Intel’s microprocessor chip could be manufactured in the millions, which made it very cheap. Initially, it cost several hundred dollars, compared to IBM’s prices in the tens of millions. Thus it found an initial market among do-it-yourselfers, tinkerers, and experimentalists. As the market grew, a few more companies designed their own microprocessors to compete with Intel. The market kept growing, and the chips kept getting cheaper. The micro approach was attaining its own economies of scale, not in size but in quantity. Soon, the computers built from these small chips became commodity items, which further enlarged the market, leading to even better economies of scale.
The rest is history.
Intel beat IBM, and the micro approach led to desktop computers, laptop computers, the Internet, smartphones, and other technologies that created a new kind of economy, now described as an information economy.
The micro approach became a deep game changer not only in the computer industry but in every other sector of the economy, as cheap computing, cheap memory, and cheap communications changed the way other industries did their work.