• Lecture: Generations of computers; main characteristics of computers of different generations

    Table - Comparison of computer generations

    Time period:              1st: 1946-1959;  2nd: 1960-1969;  3rd: 1970-1979;  4th: since 1980.
    Element base (CU, ALU):   1st: vacuum tubes;  2nd: semiconductors (transistors);  3rd: integrated circuits;  4th: large-scale integrated circuits (LSI).
    Main type of computer:    1st-2nd: large;  3rd: small (mini);  4th: micro.
    Basic input devices:      1st: control panel, punched cards, punched tape;  2nd: alphanumeric display and keyboard added;  3rd: alphanumeric display, keyboard;  4th: color graphic display, scanner, keyboard.
    Main output devices:      1st-2nd: alphanumeric printer, punched-tape output;  3rd-4th: plotter, printer.
    External memory:          1st: magnetic tapes, drums, punched tapes and cards;  2nd: magnetic disk added;  3rd: punched paper tapes, magnetic disks;  4th: magnetic and optical disks.
    Key software solutions:   1st: universal programming languages, translators;  2nd: batch operating systems, optimizing translators;  3rd: interactive operating systems, structured programming languages;  4th: friendly software, network operating systems.
    Computer operating mode:  1st: single-program;  2nd: batch;  3rd: time sharing;  4th: personal work and networked data processing.
    Purpose of use:           1st: scientific and technical calculations;  2nd: technical and economic calculations;  3rd: management and economic calculations;  4th: telecommunications, information services.

    Table - Main characteristics of computers of different generations

    Period, years:     1st: 1946-1960;  2nd: 1955-1970;  3rd: 1965-1980;  4th: 1980 to the present.
    Element base:      1st: vacuum tubes;  2nd: semiconductor diodes and transistors;  3rd: integrated circuits;  4th: very large-scale integrated circuits (VLSI).
    Architecture:      1st: von Neumann architecture;  2nd: multiprogram mode;  3rd: local computer networks, shared-use computing systems;  4th: multiprocessor systems, personal computers, global networks.
    Performance:       1st: 10-20 thousand op/s;  2nd: 100-500 thousand op/s;  3rd: about 1 million op/s;  4th: tens and hundreds of millions of op/s.
    Software:          1st: machine languages;  2nd: operating systems, algorithmic languages;  3rd: operating systems, dialog systems, computer graphics systems;  4th: application packages, databases and knowledge bases, browsers.
    External devices:  1st: punched-tape and punched-card input devices;  2nd: alphanumeric printers, teleprinters, magnetic tape and magnetic drum storage;  3rd: video terminals, hard disk drives;  4th: floppy disk drives, modems, scanners, laser printers.
    Application:       1st: computational problems;  2nd: engineering, scientific and economic tasks;  3rd: automated control systems (ACS), CAD, scientific and technical problems;  4th: management tasks, communications, workstations, text processing, multimedia.
    Examples:          1st: ENIAC, UNIVAC (USA); BESM-1, BESM-2, M-1, M-20 (USSR).  2nd: IBM 701/709 (USA); BESM-4, M-220, Minsk, BESM-6 (USSR).  3rd: IBM 360/370, PDP-11/20, Cray-1 (USA); ES-1050, ES-1066, Elbrus-1, Elbrus-2 (USSR).  4th: Cray T3E, SGI (USA); PCs, servers, workstations from various manufacturers.

    Over the course of 50 years, several generations of computers have succeeded one another. The rapid development of computing technology throughout the world has been driven above all by advances in the element base and in architectural solutions.
    Since a computer is a system comprising hardware and software, it is natural to understand a generation as the set of computer models characterized by the same technological and software solutions (element base, logical architecture, software). At the same time, in a number of cases it proves very difficult to assign computing equipment to a particular generation, because the line between generations grows ever more blurred.
    First generation.
    Element base: vacuum tubes and relays; RAM was built on flip-flops, later on ferrite cores. Reliability was low and a cooling system was required; computers were of considerable size. Performance: 5-30 thousand arithmetic op/s. Programming was done in machine code; later autocodes and assemblers appeared. Programming was carried out by a narrow circle of mathematicians, physicists and electronics engineers. First-generation computers were used mainly for scientific and technical calculations.

    Second generation.
    Semiconductor element base. Reliability and performance increased significantly; size and power consumption decreased. Input/output facilities and external memory developed. A number of progressive architectural solutions and a further development of programming technology appeared: the time-sharing and multiprogramming modes (overlapping the central processor's data processing with the work of the input/output channels, and parallelizing the fetching of instructions and data from memory).
    Within the second generation, the differentiation of computers into small, medium and large became clearly visible. The range of problems solved by computers expanded significantly: planning, economic tasks, production process control, and so on.
    Automated control systems (ACS) for enterprises, for entire industries and for technological processes were created. The end of the 1950s is marked by the appearance of a number of problem-oriented high-level programming languages: FORTRAN, ALGOL-60 and others. Software development advanced with libraries of standard programs in various programming languages and for various purposes, and with monitors and dispatchers that controlled the computer's operating modes and scheduled its resources, laying the groundwork for the operating-system concepts of the next generation.

    Third generation.
    Element base: integrated circuits (ICs). Series of computer models appeared that were software-compatible from the bottom up and offered growing capabilities from model to model. The logical architecture of computers and of their peripheral equipment became more complex, which substantially expanded their functional and computing capabilities. Operating systems (OS) became part of the computer; many tasks of managing memory, input/output devices and other resources were taken over by the OS or directly by the hardware. Software became powerful: database management systems (DBMS) and computer-aided design (CAD) systems for various purposes appeared, and automated control systems and process control systems were improved. Much attention was paid to creating application program packages (APP) for various purposes.
    Programming languages and systems developed further. Examples: the IBM/360 series (USA, serial production since 1964); the ES computers (USSR and CMEA countries, since 1972).
    Fourth generation.
    The element base became large-scale (LSI) and very large-scale (VLSI) integrated circuits. Computers were now designed for the efficient use of software: for example, UNIX-like computers, best immersed in the UNIX software environment, and Prolog machines oriented toward artificial-intelligence tasks. Telecommunication-based information processing developed rapidly thanks to better communication channels, including satellite links. National and transnational information and computer networks were created, making it possible to speak of the beginning of the computerization of human society as a whole.
    Further intellectualization of computing technology is defined by the creation of more developed human-computer interfaces, knowledge bases, expert systems, parallel programming systems, and so on.
    The element base brought great advances in miniaturization and in the reliability and performance of computers. Micro- and mini-computers appeared that surpassed the capabilities of the medium and large computers of the previous generation at a considerably lower cost. VLSI-based processor production technology accelerated the pace of computer manufacture and brought computers to the broad masses of society. With the arrival of a universal processor on a single chip (the Intel 4004 microprocessor, 1971), the era of the PC began.
    The first PC may be considered the Altair-8800, built by E. Roberts in 1974 around the Intel 8080. P. Allen and W. Gates wrote a translator for the popular Basic language, significantly increasing the intelligence of this first PC (they later founded the famous Microsoft company). The face of the fourth generation is largely defined by supercomputers, characterized by high performance (average speed of 50-130 megaflops; 1 megaflops = 1 million floating-point operations per second) and a non-traditional architecture (parallelization based on pipelined instruction processing). Supercomputers are used for problems of mathematical physics, cosmology and astronomy, modeling of complex systems, and so on. Since powerful computers play, and will continue to play, an important switching role in networks, network issues are often discussed together with supercomputer issues. Among domestic supercomputer developments one can name the Elbrus series and the PS-2000 and PS-3000 computing systems, containing up to 64 processors controlled by a common instruction stream; on a number of tasks a performance of about 200 megaflops was achieved. At the same time, given the complexity of developing and implementing modern supercomputer projects, which demand intensive fundamental research in computer science, electronic technologies, high production standards and serious financial outlays, it seems unlikely that domestic supercomputers matching the best foreign models in their main characteristics will be created in the foreseeable future.
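    The spread in speed across the generations is easier to feel with a small calculation. A minimal Python sketch, illustrative only: the speeds are rough midpoints of the table figures above, and the one-billion-operation job is an invented example.

```python
# Rough per-generation speeds (operations per second), taken as midpoints
# of the table figures; the job size is an invented illustration.
speeds = {
    "1st generation": 15_000,        # 10-20 thousand op/s
    "2nd generation": 300_000,       # 100-500 thousand op/s
    "3rd generation": 1_000_000,     # about 1 million op/s
    "4th generation": 100_000_000,   # hundreds of millions of op/s
}

JOB_OPS = 1_000_000_000  # a hypothetical job of one billion operations

def runtime_seconds(ops_per_second: int, total_ops: int = JOB_OPS) -> float:
    """Seconds needed to execute total_ops at the given speed."""
    return total_ops / ops_per_second

for gen, speed in speeds.items():
    print(f"{gen}: {runtime_seconds(speed):,.0f} s")
```

    At these rates the same job shrinks from roughly a day of first-generation machine time to seconds on a fourth-generation machine.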
    It should be noted that with the transition to IC technology in computer production, the defining emphasis of a generation shifts more and more from the element base to other indicators: logical architecture, software, user interface, application areas, and so on.

    The textbook consists of two sections, theoretical and practical. The theoretical part presents the foundations of modern computer science as a complex scientific and technical discipline: the structure and general properties of information and information processes, the general principles of construction of computing devices, the organization and functioning of information and computer networks, computer security, and the key concepts of algorithmization and programming, databases and DBMS. Self-testing questions and tests are offered to check the acquired theoretical knowledge. The practical part covers the algorithms of basic operations in the word processor Microsoft Word, the spreadsheet program Microsoft Excel, the presentation program Microsoft PowerPoint, archiving programs and antivirus programs. To consolidate the practical course, independent work is proposed at the end of each section.

    Book:

    In accordance with the element base and the level of software development, four actual generations of computers are distinguished; a brief description of each is given in Table 1.

    Table 1



    First-generation computers had a low speed of a few tens of thousands of operations per second. Ferrite cores were used as internal memory.

    The main disadvantage of these computers was the mismatch between the performance of the internal memory and that of the ALU and control unit, owing to their different element bases. Overall performance was set by the slower component, the internal memory, which reduced the overall effect. Already in first-generation computers, attempts were made to eliminate this drawback by making the devices asynchronous and by introducing output buffering, whereby the transmitted information is "dumped" into a buffer, freeing the device for further work (the principle of autonomy). Thus the input/output devices acquired memory of their own.
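    The buffering idea can be sketched in modern terms. This is an illustration only, not period code: a worker thread stands in for the autonomous output device, and the "processor" drops results into a buffer and immediately moves on.

```python
import queue
import threading

# Sketch of the autonomy principle: the "processor" dumps results into a
# buffer and continues computing, while a separate "output device" thread
# drains the buffer at its own pace.
buffer: queue.Queue = queue.Queue()
printed = []  # stands in for the slow peripheral's output

def output_device():
    # Models an autonomous peripheral draining the buffer on its own.
    while True:
        item = buffer.get()
        if item is None:          # sentinel: no more output
            break
        printed.append(item)      # a real device would print here

writer = threading.Thread(target=output_device)
writer.start()

for i in range(5):                # the processor "computes" and moves on
    buffer.put(f"result {i}")     # put() returns immediately
buffer.put(None)                  # signal end of output
writer.join()                     # wait for the device to finish draining
```

    The key property is that `put()` returns at once, so the producer never waits on the slow device.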

    A significant functional limitation of first-generation computers was their focus on executing arithmetic operations; when attempts were made to adapt them to analysis tasks, they proved ineffective.

    There were no programming languages as such yet; to code their algorithms, programmers used machine instructions or assemblers, which complicated and slowed the programming process. By the end of the 1950s, programming tools underwent a fundamental change: a transition was made to automating programming with universal languages and libraries of standard programs. The use of universal languages led to the appearance of translators.
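    What a translator does can be shown on a toy scale. The sketch below (an assumption-laden illustration, not a historical translator) compiles an arithmetic expression of a "universal language" into instructions for a simple stack machine, then executes them.

```python
import ast

# A toy "translator": compiles an arithmetic expression into instructions
# for a simple stack machine, the way early translators turned a universal
# language into machine code. Supports +, -, * on integer constants.

OPS = {ast.Add: "ADD", ast.Sub: "SUB", ast.Mult: "MUL"}

def translate(expr: str) -> list:
    """Return a list of stack-machine instructions for expr."""
    def emit(node):
        if isinstance(node, ast.Constant):
            return [("PUSH", node.value)]
        if isinstance(node, ast.BinOp):
            # Translate operands first, then the operation (postfix order).
            return emit(node.left) + emit(node.right) + [(OPS[type(node.op)],)]
        raise ValueError("unsupported construct")
    return emit(ast.parse(expr, mode="eval").body)

def run(program: list) -> int:
    """Execute the translated program on a stack machine."""
    stack = []
    for instr in program:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append({"ADD": a + b, "SUB": a - b, "MUL": a * b}[instr[0]])
    return stack.pop()
```

    For example, `translate("2 + 3 * 4")` yields PUSH/PUSH/PUSH/MUL/ADD, and running it gives 14: the programmer writes the formula, the translator produces the machine-level steps.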

    Programs were executed task by task: the operator had to monitor the progress of a problem and, when it finished, initiate the execution of the next task.

    The modern era of computer use in our country began in 1950, when the first domestic computer, MESM (Small Electronic Calculating Machine), was created at the Institute of Electrical Engineering of the Academy of Sciences of the Ukrainian SSR under the leadership of S. A. Lebedev. During the first stage of development of computing technology in our country, a number of computers were created: BESM, Strela, Ural, M-2.

    The second generation of computers marked the transition to a transistor element base and the appearance of the first mini-computers.

    The principle of autonomy developed further: it is now implemented at the level of individual devices, which is expressed in their modular structure. Input/output devices were given their own control units (called controllers), which freed the central control unit from managing input/output operations.

    The improvement and falling cost of computers reduced the share of computer time and computing resources in the total cost of an automated data-processing solution, while the costs of program development (i.e. programming) hardly fell and in some cases tended to grow. Hence a trend toward efficient programming arose, which began to be realized in the second generation of computers and continues to this day.

    On the basis of libraries of standard programs, integrated systems began to be developed with the property of portability, i.e. able to run on computers of different makes. The most frequently used software was grouped into application program packages for solving problems of a particular class.

    The technology of executing programs on a computer improved: special software tools, system software, were created.

    The purpose of system software is to speed up and simplify the processor's transition from one task to another. The first batch-processing systems appeared; they simply automated launching one program after another and thereby increased the processor load factor. Batch-processing systems were the prototype of modern operating systems and became the first system programs designed to control the computing process. During the implementation of batch-processing systems, a formalized job-control language was developed, with which the programmer told the system and the operator what work he wanted performed on the computer. A set of several jobs, usually in the form of a deck of punched cards, is called a job package. This element is still alive: MS-DOS batch (or command) files are nothing other than job packages (their .bat extension abbreviates the English word batch, meaning a package).
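    The behavior of such a batch monitor can be sketched in a few lines. This is a simplified illustration with invented job names, not a reconstruction of any real system: jobs from a "package" are started one after another without operator intervention.

```python
# Sketch of a batch-processing monitor: each job in the package runs in
# turn, and its result is logged, with no operator intervention between
# jobs. Job names and workloads are invented for illustration.
jobs = [
    ("payroll", lambda: 2 + 2),
    ("inventory", lambda: 10 * 10),
    ("report", lambda: sum(range(5))),
]

log = []

def run_batch(package):
    """Run every job in the package in order, recording (name, result) -
    the automation that raised the processor load factor."""
    for name, job in package:
        log.append((name, job()))
    return log

run_batch(jobs)
```

    The operator's only remaining role is to submit the package; the monitor handles every transition from one job to the next.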

    Second-generation domestic computers include the Promin, Minsk, Hrazdan and Mir machines.

    Third-generation computers emerged and developed in the 1970s. In our country these were the ES computers, ASVT and SM computers. This stage saw the transition to an integrated element base and the creation of multi-machine systems, since a substantial increase in performance could no longer be achieved with a single computer. Computers of this generation were therefore built on the principle of unification, which made it possible to assemble arbitrary computing systems for various fields of activity.

    The expanded functionality of computers widened their fields of application, which increased the volume of processed information and posed the task of storing data in special databases and maintaining them. Thus the first database management systems, DBMS, appeared.

    The forms of computer use also changed: the introduction of remote terminals (displays) made it possible to introduce time sharing widely and effectively, thereby bringing the computer closer to the user and expanding the range of problems solved.

    Time sharing was enabled by a new type of operating system that supports multiprogramming. Multiprogramming is a way of organizing the computing process in which several programs execute alternately on one processor. While one program performs an I/O operation, the processor does not stand idle, as it did when programs ran sequentially (single-program mode), but executes another program (multi-program mode). Each program is loaded into its own section of internal memory, called a partition. Multiprogramming aims to create for each individual user the illusion of sole use of the computer; such operating systems were therefore interactive in nature, with the user solving problems in a dialogue with the computer.
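    The alternation described above can be sketched with Python generators. This is a deterministic toy model, not an operating system: a `yield` stands in for a program starting an I/O operation, and a round-robin scheduler hands the processor to the next ready program instead of idling.

```python
# Sketch of multiprogramming: two "programs" share one "processor".
# When a program yields (models waiting for I/O), the scheduler switches
# to another program instead of letting the processor stand idle.
trace = []

def program(name, steps):
    for i in range(steps):
        trace.append(f"{name}:{i}")   # a burst of computation
        yield                         # "starts an I/O operation"

def scheduler(programs):
    """Round-robin over the ready programs until all have finished."""
    ready = list(programs)
    while ready:
        prog = ready.pop(0)
        try:
            next(prog)                # give the processor to this program
            ready.append(prog)        # resume it after its "I/O" completes
        except StopIteration:
            pass                      # the program has terminated

scheduler([program("A", 3), program("B", 3)])
```

    With two three-step programs the trace alternates A:0, B:0, A:1, B:1, A:2, B:2: whenever one program "waits", the processor is already running the other.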

    Fifth generation.
    It originates in the depths of the fourth generation and is largely defined by the results of the work of the Japanese Committee for Scientific Research in the field of computers, published in 1981. According to this project, fifth-generation computers and computing systems, beyond the high performance and reliability at lower cost fully provided by VLSI and other latest technologies, must satisfy the following qualitatively new functional requirements:

    · ensure ease of use through voice input/output systems, interactive information processing in natural languages, and capabilities for learning, associative constructions and logical inference;

    · simplify the process of creating software by automating the synthesis of programs from specifications of the original requirements in natural languages;

    · improve the basic characteristics and usability of computers for various social purposes, improve the cost-to-performance ratio, speed, lightness and compactness of computers; ensure their diversity, high adaptability to applications and operational reliability.

    Given the complexity of the tasks set for the fifth generation, it may well be divided into more visible and tangible stages, the first of which was largely realized within the current fourth generation.

    Five main generations of computers can be distinguished, although the division of computer technology into generations is quite arbitrary.

    I generation of computers: computers designed in 1946-1955

    1. Element base: vacuum tubes.
    2. Connection of elements: point-to-point wiring.
    3. Dimensions: the computer took the form of huge cabinets.

    These computers were enormous, unwieldy and so expensive that only large corporations and governments could buy them.

    The tubes consumed a great deal of electricity and gave off a lot of heat.
    4. Performance: 10-20 thousand operations per second.
    5. Operation: difficult, because the vacuum tubes failed frequently.
    6. Programming: machine code, which required knowing all the machine instructions, binary representation and the computer's architecture. Most of those involved were mathematicians and programmers; servicing the computer demanded high professionalism from the personnel.
    7. RAM: up to 2 KB.
    8. Data was input and output using punched cards and punched tapes.

    II generation of computers: computers designed in 1955-1965

    In 1948 John Bardeen, William Shockley and Walter Brattain invented the transistor, for which they received the Nobel Prize in 1956.

    One transistor replaced 40 vacuum tubes and was much cheaper and more reliable.

    In 1958 the M-20 machine was created, performing 20 thousand operations per second: the most powerful computer in Europe in the 1950s.

    In 1963 Douglas Engelbart, a researcher at the Stanford Research Institute, demonstrated the first mouse.

    1. Element base: semiconductor elements (transistors, diodes).
    2. Connection of elements: printed circuit boards and point-to-point wiring.

    3. Dimensions: the computer took the form of racks of similar design, slightly taller than a person, but a special machine room was required to house it.
    4. Performance: 100-500 thousand operations per second.
    5. Operation: computing centers with a special staff of service personnel; a new specialty, the computer operator, appeared.
    6. Programming: in algorithmic languages; the first operating systems appeared.
    7. RAM: 2-32 KB.
    8. The principle of time sharing was introduced: the operation of different devices is overlapped in time.

    9. Disadvantage: software incompatibility.

    Already from the second generation, machines began to be divided into large, medium and small on the basis of size, cost and computing capabilities.

    Thus, small domestic second-generation machines (Nairi, Hrazdan, Mir and others) were quite affordable for every university at the end of the 1960s, while the above-mentioned BESM-6 had professional characteristics (and a cost) 2-3 orders of magnitude higher.

    III generation of computers: computers designed in 1965-1975

    In \(1958\), Jack Kilby and Robert Noyce, independently of each other, invented the integrated circuit (IC).

    In \(1961\) the first integrated circuit made on a silicon wafer went on sale.

    In \(1965\), production of the third-generation IBM-360 family (USA) began. The models shared a unified instruction set and differed from each other in RAM capacity and performance.

    In \(1967\), production of the BESM-6 (\(1\) million operations per second) and the Elbrus (\(10\) million operations per second) began.

    In 1969, IBM separated the concepts of hardware and software. The company began selling software separately from hardware, marking the beginning of the software industry.

    On October \(29\), \(1969\), the first wide-area military computer network, ARPANET, began operating, connecting research laboratories across the United States.

    Note!

    In \(1971\), the first microprocessor was created by Intel. A single chip contained \(2250\) transistors.

    1. Element base: integrated circuits.

    3. Dimensions: the computer was built as a set of similar racks.
    4. Performance: \(1-10\) million operations per second.
    5. Operation: computing centers, display classrooms, and a new specialty: the system programmer.
    6. Programming: algorithmic languages, operating systems.
    7. RAM: \(64\) KB.

    As we moved from the first to the third generation, programming capabilities changed radically. Writing programs in machine code for first-generation machines (and, somewhat more simply, in assembly language for most second-generation machines) is an activity the vast majority of modern programmers encounter only while studying at a university.

    The emergence of high-level procedural languages and translators for them was the first step toward a radical expansion of the circle of programmers. Scientists and engineers began to write programs themselves to solve their own problems.
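    The contrast between the two eras of programming can be sketched with a purely illustrative toy example (the "machine" below is hypothetical, not the instruction set of any historical computer): the same task, summing the numbers from 1 to 5, spelled out step by step for a tiny single-accumulator machine versus expressed in one line of a high-level language.

```python
def run(program):
    """Interpret a toy single-accumulator machine given (op, arg) pairs."""
    acc, mem = 0, {}
    for op, arg in program:
        if op == "LOADI":   # load an immediate value into the accumulator
            acc = arg
        elif op == "ADD":   # add a memory cell to the accumulator
            acc += mem.get(arg, 0)
        elif op == "STORE": # store the accumulator into a memory cell
            mem[arg] = acc
    return acc

# "Machine-level" style: every step written out by the programmer.
program = [
    ("LOADI", 0), ("STORE", "sum"),
    ("LOADI", 1), ("ADD", "sum"), ("STORE", "sum"),
    ("LOADI", 2), ("ADD", "sum"), ("STORE", "sum"),
    ("LOADI", 3), ("ADD", "sum"), ("STORE", "sum"),
    ("LOADI", 4), ("ADD", "sum"), ("STORE", "sum"),
    ("LOADI", 5), ("ADD", "sum"), ("STORE", "sum"),
]
low_level = run(program)

# High-level style: the programmer states intent, not machine steps.
high_level = sum(range(1, 6))

print(low_level, high_level)  # both 15
```

Both versions compute the same result, but the second requires no knowledge of the machine's registers or memory layout, which is precisely why high-level languages opened programming to scientists and engineers.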

    Already in the third generation, large unified series of computers appeared. For large and medium-sized machines in the US, this was primarily the IBM 360/370 family. In the USSR, the \(70\)s and \(80\)s were the time of the creation of unified series: the ES (Unified System) computers (large and medium-sized machines), the SM (System of Mini) computers, and the Elektronika series of microcomputers.

    They were based on American prototypes from IBM and DEC (Digital Equipment Corporation). Dozens of computer models were created and released, differing in purpose and performance. Their production was practically discontinued in the early \(90\)s.

    IV generation of computers: computers designed from \(1975\) to the beginning of the \(90\)s

    In \(1975\) IBM was the first to begin industrial production of laser printers.

    In \(1976\), IBM created the first inkjet printer.

    In \(1976\) the first personal computer was created.

    Steve Jobs and Steve Wozniak founded Apple, a company producing personal computers intended for a wide range of non-professional users. The Apple I was sold at the memorable price of \(666.66\) dollars, and about two hundred units were sold in ten months.

    In \(1976\), the first \(5.25\)-inch floppy disk appeared.

    In \(1981\), IBM began producing the IBM PC with the Intel 8088 processor, which established the principles of open architecture: a computer could be assembled like building blocks according to the available budget, with the possibility of later replacing components and adding new ones.

    In \(1988\), the first worm virus, which spread via e-mail, appeared.

    In \(1993\) the production of IBM PC computers with a Pentium processor began.

    1. Element base: large integrated circuits (LSI).
    2. Connection of elements: printed circuit boards.
    3. Dimensions: compact computers, laptops.
    4. Performance: \(10-100\) million operations per second.
    5. Operation: multiprocessor and multi-machine systems; computers accessible to any user.
    6. Programming: databases and data banks.
    7. RAM: \(2-5\) MB.
    8. Telecommunication data processing, integration into computer networks.

    V generation of computers: developments since the \(90\)s of the twentieth century

    The element base is ultra-large-scale integrated circuits (VLSI) using optoelectronic principles (lasers, holography).

    After the creation of the EDSAC machine in England in 1949, a powerful impetus was given to the development of general-purpose computers, stimulating the appearance in a number of countries of the computer models that made up the first generation. Over more than 40 years of development of computing technology (VT), several generations of computers have appeared, replacing one another.

    First-generation computers used vacuum tubes and relays as their element base; main memory was built on flip-flops, and later on ferrite cores. Performance was typically in the range of 5-30 thousand arithmetic operations per second; the machines were unreliable, required cooling systems, and were very large. Programming required considerable skill and good knowledge of the computer's architecture and software capabilities. At the beginning of this stage programs were written in machine code; later autocodes and assemblers appeared. As a rule, first-generation computers were used for scientific and technical calculations, and programming itself was more of an art, practiced by a very narrow circle of mathematicians, electrical engineers, and physicists.

    EDSAC computer, 1949

    2nd generation computer

    At first, the creation of the first transistor in the USA on July 1, 1948 did not herald a new stage in the development of computing technology; the device was associated primarily with radio engineering and looked more like a prototype of a new electronic component requiring serious research and refinement. Already in 1951, William Shockley demonstrated the first reliable transistor. However, transistors remained quite expensive (up to $8 apiece), and only after the development of silicon technology did their price drop sharply, accelerating the miniaturization of electronics, including computing.

    It is generally accepted that the second generation begins with the RCA-501 computer, which appeared in the USA in 1959 and was built on a semiconductor element base (although an onboard transistor computer had already been created in 1955 for the ATLAS intercontinental ballistic missile). The new element technology dramatically increased reliability, reduced size and power consumption, and significantly raised performance. This made it possible to build computers with greater logical capability and speed, expanding the range of applications to economic planning, production-process control, and so on. Within the second generation, the differentiation of computers into small, medium, and large became increasingly clear. The end of the 50s also marked the beginning of programming automation, which led to the appearance of the programming languages Fortran (1957), Algol-60, and others.

    3rd generation computer

    The third generation is associated with the advent of computers whose element base was the integrated circuit (IC). In January 1959, Jack Kilby created the first IC, a thin germanium plate 1 cm long. To demonstrate the capabilities of integrated technology, Texas Instruments built an onboard computer for the US Air Force containing 587 ICs, with a volume (40 cm³) 150 times smaller than a comparable computer of the old type. But Kilby's IC had a number of significant shortcomings, which were eliminated with the appearance of Robert Noyce's planar ICs that same year. From that moment, IC technology began its triumphal march, capturing ever newer areas of modern electronics, first and foremost computing.

    The software ensuring the operation of the computer in various modes became significantly more powerful. Mature database management systems (DBMS) and computer-aided design (CAD) systems appeared, and much attention was paid to creating application program packages for various purposes. New programming languages and systems continued to emerge, and existing ones continued to develop.

    4th generation computer

    The design and technological basis of fourth-generation computing is large-scale (LSI) and very-large-scale (VLSI) integrated circuits, created in the 70s and 80s respectively. Such ICs contain tens or hundreds of thousands, even millions, of transistors on a single crystal (chip). LSI technology was already partially used in projects of the previous generation (IBM/360, ES Computer Series-2, etc.). The most important conceptual criterion separating fourth-generation computers from the third is that the former were designed for the efficient use of modern high-level languages and for simplifying programming for the applications programmer. In hardware terms, they are characterized by extensive use of IC technology and high-speed storage devices. The best-known series of fourth-generation computers is the IBM/370, which, unlike the equally well-known third-generation IBM/360 series, has a richer instruction set and makes wider use of microprogramming. The senior models of the 370 series implemented virtual memory, giving the user the appearance of unlimited RAM.
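    The idea behind virtual memory can be sketched in a few lines. This is a minimal illustration of the general principle, not the IBM/370's actual translation mechanism; the page size and page-table contents below are hypothetical: a user-visible (virtual) address is split into a page number and an offset, and the page number is mapped through a table to a physical frame, so the program is written against an address space larger than the installed RAM.

```python
PAGE_SIZE = 4096  # illustrative page size, in bytes

# Hypothetical page table: virtual page number -> physical frame number.
page_table = {0: 5, 1: 2, 2: 7}

def translate(virtual_addr):
    """Split a virtual address into (page, offset) and map it to a physical address."""
    page, offset = divmod(virtual_addr, PAGE_SIZE)
    if page not in page_table:
        # In a real system this would trigger loading the page from backing store.
        raise LookupError("page fault: page %d not resident" % page)
    return page_table[page] * PAGE_SIZE + offset

print(translate(4100))  # virtual page 1, offset 4 -> frame 2 -> 8196
```

Pages absent from the table raise a "page fault" here; in a real machine the operating system would transparently fetch the page from disk and retry, which is what creates the appearance of unlimited memory.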

    The phenomenon of the personal computer (PC) traces back to the creation in 1965 of the first minicomputer, the PDP-8, which emerged from the universalization of a specialized processor for controlling a nuclear reactor. The machine quickly gained popularity and became the first mass-produced computer of its class; by the early 70s, more than 100 thousand units had been produced. A further important step was the transition from mini- to microcomputers; this new structural level of computing took shape at the turn of the 70s, when the advent of LSI made it possible to build a universal processor on a single chip. The first microprocessor, the Intel 4004, was created in 1971 and contained 2,250 elements; the first universal microprocessor, the Intel 8080, created in 1974 and a standard for microcomputer technology, already contained 4,500 elements and served as the basis for the first PCs. In 1979, one of the most powerful and versatile 16-bit microprocessors, the Motorola 68000 with 70,000 elements, was released, and in 1981 Hewlett-Packard released the first 32-bit microprocessor, with 450 thousand elements.

    PC Altair-8800

    The first PC can be considered the Altair-8800, created in 1974 by Edward Roberts around the Intel 8080 microprocessor. The computer was sold by mail order, cost only $397, and could be expanded with peripheral devices (it had only 256 bytes of RAM!). For the Altair-8800, Paul Allen and Bill Gates wrote a translator for the popular BASIC language, significantly increasing the intelligence of the first PC (they later founded the now-famous Microsoft). Equipping the PC with a color monitor led to a competing model, the Z-2; within a year of the Altair-8800's appearance, more than 20 different companies had joined PC production, and a PC industry began to take shape (manufacturing, sales, periodical and non-periodical publications, exhibitions, conferences, and so on). In 1977, three PC models went into mass production: the Apple-2 (Apple Computer), the TRS-80 (Tandy Radio Shack), and the PET (Commodore). Apple, initially lagging in the competition, soon became the leader in PC production (its Apple-2 model was a huge success). By 1980, Apple entered Wall Street with the largest share capital and an annual income of $117 million.

    But already in 1981, IBM, so as not to lose the mass market, began producing its now widely known PC series, the IBM PC/XT/AT and PS/2, which opened a new era of personal computing. The entry of the giant IBM into the PC arena put PC production on an industrial footing, which solved a number of issues important to users (standardization, unification, developed software, etc.), to which the company had already paid great attention in producing the IBM/360 and IBM/370 series. It is fair to say that in the short period from the debut of the Altair-8800 to the IBM PC, more people became involved with computing than in the entire long period from Babbage's Analytical Engine to the invention of the first ICs.

    The first computer to open the supercomputer class proper can be considered the Amdahl 470V/6, created in 1975 and compatible with the IBM series. The machine used an effective parallelization principle based on pipelined instruction processing, and its element base used LSI technology. Today the supercomputer class includes models with an average speed of at least 20 megaflops (1 megaflops = 1 million floating-point operations per second). The first model with such performance was the largely unique ILLIAC-IV computer, created in the USA in 1975, with a maximum performance of about 50 megaflops. This model had a huge influence on the subsequent development of supercomputers with a matrix architecture. A bright page in the history of supercomputers is associated with Seymour Cray's Cray series, whose first model, the Cray-1, was created in 1976 and had a peak speed of 130 megaflops. The model's architecture was based on the pipeline principle of vector and scalar data processing with an element base of VLSI. It was this model that laid the foundation for the class of modern supercomputers. It should be noted that despite a number of interesting architectural solutions, the model's success was achieved mainly through successful technological solutions. The subsequent models Cray-2, Cray X-MP, Cray-3, and Cray-4 brought the series' performance to about 10 thousand megaflops, and the Cray MP model, using a new 64-processor architecture and an element base of new silicon chips, had a peak performance of about 50 gigaflops.

    Concluding this excursion into the history of modern computing technology, several significant comments should be made. First of all, the transition from one generation of computers to the next has become increasingly smooth: the ideas of a new generation mature, and are even partly implemented, within the previous one. This became especially noticeable with the transition to IC technology, when the defining emphasis of a generation shifted from the element base to other indicators: logical architecture, software, user interface, application areas, and so on. The most diverse computing equipment is appearing, whose characteristics no longer fit the traditional classification; one gets the impression that we are at the beginning of a kind of universalization of computer technology, in which all its classes strive to level out their computing capabilities. Many elements of the fifth generation are already, to one degree or another, characteristic of today.