The world’s first computers were not very personal. These early business computers were room-sized monsters, built from several refrigerator-sized machines linked together. Because of their enormous size, they were dubbed “mainframe computers,” and the period in which they were developed is often referred to as the “Mainframe Era.” During this era, teams of engineers were required to maintain the computers and keep them operational.
In most cases, only large businesses, universities, and government agencies could afford computers. Their primary work was to organize and store large volumes of data that were tedious to manage by hand. In fact, from the late 1950s through the 1960s, computers were seen chiefly as a way to perform mathematical calculations more efficiently. One of the earliest and most famous electronic computers, the Electronic Numerical Integrator and Computer (ENIAC), was built for the sole purpose of performing ballistics calculations for the U.S. military during World War II.
In the late 1960s, Material Requirements Planning (MRP) systems were introduced. This software, running on a mainframe computer, gave companies the ability to manage the manufacturing process and make it more efficient. From tracking inventory, to creating bills of materials, to scheduling production, MRP systems (and the later Manufacturing Resource Planning, or MRP II, systems) gave more businesses a reason to integrate computing into their processes. IBM became the dominant mainframe vendor. Continued improvement in software and the availability of cheaper hardware eventually brought mainframe computers (and their smaller sibling, the minicomputer) into most large businesses.
The mainframe computers of the 1950s and 1960s made it clear to businesses and universities that computers, while physically enormous, were also of enormous value and worth the investment. As organizations continued to adopt the mainframe, new technologies were making the idea of a smaller, more personal computer a reality. The invention of the integrated circuit marked the first major revolution in this direction: it made it possible for many electronic components (transistors, resistors, and capacitors) to reside on a single silicon chip.
The invention of the microprocessor is regarded as one of the most significant and enduring of all modern computing developments. Before the microprocessor, a computer needed a separate integrated circuit for each of its functions, resulting in a machine that was still fairly large and slow. At the size of a thumbnail, a single microprocessor could carry out all of a computer’s core processing: running programs, performing calculations, and managing data. This paved the way for devices that were much smaller and had much faster computation speeds.
In 1975 the first microcomputer, the Altair 8800, was announced on the cover of Popular Electronics. Its immediate popularity sparked the imagination of entrepreneurs everywhere, and there were quickly dozens of companies making these “personal computers.” Though at first just a niche product for computer hobbyists, improvements in usability and the availability of practical software led to growing sales.
One of the more prominent of these early personal computer makers was Apple Computer, founded by Steve Jobs and Steve Wozniak, whose Apple II was hugely successful. Not wanting to be left out of the revolution, IBM (teaming with a small company called Microsoft for its operating-system software) hurriedly released its own version of the personal computer in 1981, called simply the “PC.”
Businesses that had used IBM mainframes for years finally had the permission they needed to bring personal computers into their companies, and the IBM PC took off. Due to the IBM PC’s open architecture, it was easy for other companies to copy, or “clone” it. During the 1980s, many new computer companies sprang up, offering less expensive versions of the PC. This would ultimately drive prices down and spur innovation.
Microsoft further developed its Windows operating system and made the PC even easier to use. Common uses for the PC during this period included word processing, spreadsheets, and databases. Organizations now had a way — through technology — to efficiently manage information. Still, these early PCs, for the most part, stood alone as islands of innovation within the larger organization, as they had not yet been connected to any sort of network.
During the mid-1980s, as computers became more commonplace, businesses began to see the need to connect their computers together as a way to collaborate and share resources. This idea led businesses to develop computer networks. A computer network is a group of computers connected for the purpose of communicating and sharing data and resources. The initial networking architecture was referred to as client-server: a model in which a centralized computer (the “server”) provides data to connected computers (the “clients”) over a network. Users would log into the server from their PC, the client, and join the local area network (LAN), a network that links computers within a building.
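The client-server idea described above still underlies most networked software today. As a rough illustration (not part of the original text), here is a minimal sketch in Python using the standard socket library; the port number, hostnames, and message contents are arbitrary examples chosen for this sketch:

```python
# Minimal client-server sketch: a central "server" process holds the data and
# answers requests from a "client" over a network connection.
# The port number and message contents are arbitrary illustrations.
import socket
import threading

HOST, PORT = "127.0.0.1", 5050  # loopback: client and server on the same machine
ready = threading.Event()       # signals that the server is listening

def run_server():
    """Serve a single client: receive a request, reply with the shared data."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen()
        ready.set()                      # server is now accepting connections
        conn, _addr = srv.accept()
        with conn:
            request = conn.recv(1024).decode()
            conn.sendall(f"server data for: {request}".encode())

# Run the server in the background, then act as the client.
server = threading.Thread(target=run_server)
server.start()
ready.wait()                             # wait until the server is listening

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))            # the client joins the network...
    cli.sendall(b"inventory report")     # ...and requests a shared resource
    reply = cli.recv(1024).decode()

server.join()
print(reply)  # server data for: inventory report
```

In a 1980s LAN the server would typically hold shared files or a database and serve many clients at once; a single client is enough here to show the pattern of a centralized machine answering requests from connected PCs.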
As a result of client-server networks, software companies began developing applications that allowed multiple users to access the same data at the same time. This evolved into applications for communicating, and electronic mail (email) saw its first widespread use at this time. Networking and data sharing still stayed within the confines of each business; computers were seen as tools to collaborate internally, within an organization. In fact, these networks of computers were becoming so powerful that they were replacing many of the functions once performed by the larger mainframe computers, and at a fraction of the cost.
Source: Derived from Chapters 1 and 2 of “Information Systems for Business and Beyond” by David T. Bourgeois. Some sections removed for brevity. https://www.saylor.org/site/textbooks/Information%20Systems%20for%20Business%20and%20Beyond/Textbook.html