The technological development of the computer is often described in terms of successive generations of computing devices. The very first machines occupied entire rooms. A generation marks a distinct stage of improvement in computer technology, a ‘leap’ that fundamentally changes the way computers operate.
With each generation, the internal circuitry has become smaller, more advanced, and more versatile than its predecessor. As a result of this miniaturization, speed, processing power, and memory capacity have increased proportionally. New developments continue to change the way we live and work. Five generations of computers are generally recognized.
The first generation of computers relied on vacuum tubes for circuitry and magnetic drums for memory. These were enormous machines that took up entire rooms. First-generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could solve only one problem at a time.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers to store their instructions in memory, which moved from magnetic drum to magnetic core technology. The first computers of this generation were developed for the atomic energy industry.
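To make the distinction between instruction levels concrete, Python's standard dis module offers a loose modern analogy: it reveals the symbolic low-level instructions behind a single high-level statement, much as one line of FORTRAN compiled down to many machine instructions. This is an illustrative sketch using Python bytecode, not historical assembly.

```python
import dis

def add(a, b):
    # One high-level statement, as a programmer would write it.
    return a + b

# Disassembling shows the sequence of symbolic lower-level
# instructions the interpreter actually executes for that
# single line (loading operands, adding, returning a value).
dis.dis(add)
```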
The development of the integrated circuit was the hallmark of the third generation of computers. Although the use of transistors in place of vacuum tubes had greatly reduced the heat a computer gave off, transistors still generated enough heat to damage the machine's sensitive internal components; the integrated circuit solved this by miniaturizing transistors and placing them on silicon chips. Computers became accessible to a mass audience for the first time because they were smaller and cheaper than their predecessors.
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.
Fifth-generation computing devices, based on artificial intelligence, are still in development, though some applications, such as voice recognition, are already in use today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Parallel processing harnesses the power of many CPUs working as one, as opposed to the single central processing unit of von Neumann's design. Superconductors, an equally significant innovation, allow electricity to flow with virtually no resistance, greatly improving information flow and reducing heat.
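To make the contrast with the single-CPU design concrete, here is a minimal sketch of parallel processing using Python's standard multiprocessing module. The task, summing a large range of integers split into one chunk per core, is an illustrative choice rather than anything from the article.

```python
from multiprocessing import Pool, cpu_count

def partial_sum(bounds):
    # Each worker (one per CPU core) handles its own slice of the range.
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n = 10_000_000
    workers = cpu_count()
    step = n // workers
    # Split [0, n) into one contiguous chunk per core; the last
    # chunk absorbs any remainder.
    chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with Pool(workers) as pool:
        # The chunks run simultaneously on separate CPUs, and the
        # partial results are then combined into a single answer.
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # same result a single CPU would compute, sooner
```

The same division-of-labor idea, scaled up across many processors, is what distinguishes parallel architectures from the single sequential instruction stream of the classic von Neumann design.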