The first computers. From ancient to modern. A brief history of the creation and development of computers

The emergence of computers and computer technology

For many centuries, people have been trying to create various devices to facilitate calculations. In the history of the development of computers and computer technologies, several important events stand out that became decisive in further evolution.

In the 1640s, B. Pascal invented a mechanical device with which it was possible to add numbers.

At the end of the 17th century, G. Leibniz created a mechanical device designed for adding and multiplying numbers.

In 1946, the first mainframe computers appeared. The American scientists J. von Neumann, H. Goldstine, and A. Burks published a work in which they set out the basic principles for building a universal computer. From the late 1940s, the first prototypes of such machines, conventionally called first-generation computers, began to appear. These computers were built with vacuum tubes and lagged behind modern calculators in performance.

In the further development of computers, the following stages are distinguished:

1) second generation of computers - invention of transistors;

2) third generation of computers – creation of integrated circuits;

3) fourth generation of computers - the appearance of microprocessors (1971).

The first microprocessors were produced by Intel, and they led to the emergence of a new generation of PCs. In response to the massive public interest in such computers, IBM (International Business Machines Corporation) launched a project to create one, with Microsoft developing the software for it. The project was completed in August 1981, and the new PC became known as the IBM PC.

This model became very popular and over the next few years displaced all of IBM's previous models from the market. The invention of the IBM PC launched the production of standard IBM PC-compatible computers, which make up the majority of the modern PC market.

In addition to IBM PC-compatible computers, there are other types of computers designed to solve problems of varying complexity in various spheres of human activity.

The development of microelectronics led to the appearance of microminiature integrated electronic elements that replaced semiconductor diodes and transistors and became the basis for the development and spread of PCs. These computers had a number of advantages: they were compact, easy to use, and relatively cheap.

In 1971, Intel created the i4004 microprocessor, and in 1974 the i8080, which had a huge impact on the development of microprocessor technology. The company remains the leader in the market for PC microprocessors to this day.



Initially, PCs were developed on the basis of 8-bit microprocessors. One of the first manufacturers of computers with a 16-bit microprocessor was IBM, which until the 1980s had specialized in the production of mainframe computers. In 1981, it released the first PC built on the principle of open architecture, which made it possible to change the configuration of the computer and improve its properties.

At the end of the 1970s, other large companies in the leading countries (the USA, Japan, and others) also began developing PCs based on 16-bit microprocessors.

In 1984, Apple's Macintosh appeared, a competitor to the IBM PC. In the mid-1980s, computers based on 32-bit microprocessors were released. Currently, 64-bit systems are available.

Based on the values of their main parameters and taking their application into account, the following groups of computer equipment are distinguished:

supercomputer – a unique, extremely powerful system used to solve the most complex problems requiring large-scale calculations;

server – a computer that provides its own resources to other users; there are file servers, print servers, database servers, etc.;

personal computer – a computer designed for use in the office or at home. The user can configure, maintain and install software for this type of computer;

professional workstation – a high-performance computer intended for professional activities in a particular field. Most often it is supplied with additional equipment and specialized software;

laptop – a portable computer with the computing power of a PC. It can operate for some time without mains power;

pocket PC (electronic organizer) – no larger than a calculator, with or without a keyboard, similar in functionality to a laptop;

network PC – a computer for business use with a minimal set of external devices. Support and software installation are handled centrally; it is used to work in a computer network but can also operate standalone;

terminal – a device that does not contain a processor for executing commands; it only enters and transmits the user's commands to another computer and returns the result to the user.

The market for modern computers and the number of machines produced are determined by market needs.

Human life in the twenty-first century is directly related to artificial intelligence. Knowledge of the main milestones in the creation of computers is an indicator of an educated person. The development of computers is usually divided into 5 stages - it is customary to talk about five generations.

1946-1954 - first generation computers

The first generation of computers (electronic computers) was tube-based. Scientists at the University of Pennsylvania (USA) developed ENIAC, the world's first general-purpose electronic computer, which was officially put into operation on February 15, 1946. Some 18,000 vacuum tubes were used in its assembly. By today's standards the machine was colossal: it occupied 135 square meters, weighed 30 tons, and required 150 kW of electricity.

It is a well-known fact that this machine was used directly to help solve the most difficult problems involved in creating the atomic bomb. The USSR was rapidly catching up: in December 1951, under the leadership and with the direct participation of Academician S. A. Lebedev, the fastest computer in Europe was presented to the world. It bore the abbreviation MESM (Small Electronic Calculating Machine) and could perform 8 to 10 thousand operations per second.

1954 - 1964 - second generation computers

The next step was the development of computers running on transistors. Transistors are devices made from semiconductor materials that make it possible to control the current flowing in a circuit. The first known stably operating transistor was created in America in 1947 by the physicists Shockley, Bardeen, and Brattain.

In terms of speed, these computers differed significantly from their predecessors: speeds reached hundreds of thousands of operations per second. Both size and power consumption decreased, and the scope of application expanded significantly, thanks in part to the rapid development of software. The best Soviet computer, the BESM-6, had a record speed of 1,000,000 operations per second; it was developed in 1965 under the leadership of chief designer S. A. Lebedev.

1964 - 1971 - third generation computers

The main feature of this period is the first use of integrated circuits with a low degree of integration. Using sophisticated technologies, scientists were able to place complex electronic circuits on a small semiconductor wafer with an area of less than one square centimeter. The integrated circuit was invented in 1958 by Jack Kilby. This revolutionary invention made it possible to improve all parameters: dimensions shrank to roughly the size of a refrigerator, while performance and reliability increased.

This stage in the development of computers is characterized by the use of a new storage device - a magnetic disk. The PDP-8 minicomputer was first introduced in 1965.

In the USSR, similar machines appeared much later, in 1972, and were analogues of the models offered on the American market.

1971 - modern times - fourth generation computers

The innovation of fourth-generation computers is the use of microprocessors. A microprocessor is an arithmetic logic unit (ALU) placed on a single chip with a high degree of integration, which means the circuitry takes up even less space. In other words, a microprocessor is a small brain that performs millions of operations per second according to the program embedded in it. Size, weight, and power consumption dropped dramatically, and performance reached record highs. And that is when Intel entered the game.

The first microprocessor, the Intel 4004, was assembled in 1971. It was only 4-bit, but at the time it was a gigantic technological breakthrough. Intel then introduced the eight-bit Intel 8008 to the world, and in 1975 the Altair 8800 appeared - the first personal computer, built around the Intel 8080.

This was the beginning of the era of personal computers. Such machines began to be used everywhere for completely different purposes. A year later, Apple entered the game. The project was a great success, and Steve Jobs became one of the most famous and richest people on Earth.

The IBM PC became the undisputed standard for computers. It was released in 1981 and could address up to 1 megabyte of RAM.

It is noteworthy that at the moment IBM-compatible computers account for roughly ninety percent of all computers produced. We also cannot fail to mention the Pentium line: Intel's first processor with an integrated coprocessor appeared in 1989, and today this trademark carries unquestioned authority in the development and application of microprocessors for the computer market.

As for prospects, they lie, of course, in the development and implementation of the latest technologies: ultra-large-scale integrated circuits, magneto-optical elements, and even elements of artificial intelligence.

Self-learning electronic systems are the foreseeable future, referred to as the fifth generation in the development of computers.

People strive to erase the barrier in communicating with the computer. Japan worked on this for a very long time and, unfortunately, without success, but that is the topic of an entirely different article. At the moment all such projects are only in development, but at the current pace of progress this is the near future. The present is the time when history is being made!


1. First generation of computers
The first generation of computers saw the light of day in 1942, when the first electronic digital computer was built. This invention belongs to the American physicist Atanasoff. In 1943, the Englishman Alan Turing developed Colossus, a secret computer designed to decipher intercepted messages of the German troops. These computers ran on vacuum tubes and were the size of a room. In 1945, the mathematician John von Neumann showed that a computer could efficiently perform any calculation under appropriate program control without changing the hardware. This principle became the basic rule for future generations of high-speed digital computers.
2. Second generation of computers
In 1947, engineers John Bardeen and Walter Brattain invented the transistor. Transistors were quickly adopted in radio engineering, replacing the inconvenient and bulky vacuum tube. In the 1960s, transistors became the elementary basis of second-generation computers. Machine performance began to reach hundreds of thousands of operations per second, and the volume of internal memory increased hundreds of times compared with first-generation computers. High-level programming languages began to develop actively: Fortran, Algol, Cobol.
3. Third generation of computers
The transition to the third generation is associated with significant changes in computer architecture. Machines were already running on integrated circuits. It was possible to run several programs on one computer. The speed of many machines reached several million operations per second. Magnetic disks began to appear and input/output devices were widely used.
4. Fourth generation of computers.
Another revolutionary event in electronics occurred in 1971, when the American company Intel announced the creation of the microprocessor. By connecting a microprocessor to I/O devices and external memory, engineers obtained a new type of computer - the microcomputer, the 4th generation of computers. These computers were small and cheap and used a color graphics display, pointing devices, and a keyboard.
In 1976, the first Apple personal computer was created (soon followed by the Apple II). The first Soviet personal computer was the Agat (1985). Since 1980, the American company IBM has set the tone in the computer market. In 1981, it released its first personal computer, the IBM PC. Another line in the development of 4th-generation machines was formed by supercomputers; among Soviet machines, the Elbrus computers were classified as supercomputers. Fifth-generation computers are machines of the near future. Their main quality should be a high intellectual level: fifth-generation machines are expected to support voice input, voice communication, and machine "vision" and "touch". Much has already been done in this direction.

This article describes the main stages of computer development, the main directions in which computer technologies have developed, and the reasons behind them.

The main stages of computer development

During the evolution of computer technology, hundreds of different computers have been developed. Many of them have long been forgotten, while others have had a significant influence on modern ideas. In this article, we give a brief overview of some key historical moments to better understand how developers arrived at the concept of the modern computer. We consider only the main milestones, leaving many details aside. The computers that we will consider are presented in the table below.

The main stages in the history of computer development:

Year of issue Computer name Creator Notes
1834 Analytical Engine Babbage First attempt to build a digital computer
1936 Z1 Zuse First relay calculating machine
1943 COLOSSUS British government First electronic computer
1944 Mark I Aiken The first American multi-purpose computer
1946 ENIAC I Eckert/Mauchley The history of modern computers begins with this machine
1949 EDSAC Wilkes The first computer with programs stored in memory
1951 Whirlwind I MIT First real time computer
1952 IAS Von Neumann This design is used in most modern computers
1960 PDP-1 DEC First mini-computer (50 copies sold)
1961 1401 IBM A very popular small computer
1962 7094 IBM Dominated scientific computing in the early 1960s
1963 B5000 Burroughs First machine designed for a high level language
1964 360 IBM First family of computers
1964 6600 CDC The first supercomputer for scientific calculations
1965 PDP-8 DEC First mass-market mini-computer (50,000 units sold)
1970 PDP-11 DEC These minicomputers dominated the computer market in the 70s
1974 8080 Intel The first universal 8-bit computer on a chip
1974 CRAY-1 Cray The first vector supercomputer
1978 VAX DEC First 32-bit superminicomputer
1981 IBM PC IBM The era of modern personal computers has begun
1981 Osborne-1 Osborne First portable computer
1983 Lisa Apple The first PC with a graphical user interface
1985 386 Intel First 32-bit predecessor to the Pentium line
1985 MIPS MIPS First RISC computer
1987 SPARC Sun First RISC workstation based on the SPARC processor
1990 RS6000 IBM First superscalar computer
1992 Alpha DEC First 64-bit PC
1993 Newton Apple The first pocket computer

In total, six stages can be distinguished in the history of computer development: mechanical computers, computers based on vacuum tubes (such as ENIAC), transistor computers (the IBM 7094), the first computers based on integrated circuits (the IBM 360), personal computers (lines built around Intel CPUs), and so-called invisible computers.

Zero generation - mechanical computers (1642-1945)

The first person to create a calculating machine was the French scientist Blaise Pascal (1623-1662), after whom one of the programming languages ​​is named. Pascal designed this machine in 1642, when he was just 19 years old, for his father, a tax collector. It was a mechanical design with gears and a manual drive. Pascal's calculating machine could only perform addition and subtraction operations.

Thirty years later, the great German mathematician Gottfried Wilhelm Leibniz (1646-1716) built another mechanical machine that could perform multiplication and division in addition to addition and subtraction. In essence, three centuries ago Leibniz created something like a four-function pocket calculator.

Another 150 years later, Cambridge University mathematics professor Charles Babbage (1792-1871), inventor of the speedometer, developed and constructed the Difference Engine. This mechanical machine, which, like Pascal's machine, could only add and subtract, calculated tables of numbers for sea navigation. The machine implemented only one algorithm: the method of finite differences applied to polynomials. It had a rather interesting way of outputting information: the results were pressed with a steel die into a copper plate, anticipating later input-output media such as punched cards and CDs.
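To make the idea of the finite difference method concrete, here is a minimal sketch (purely illustrative, not a model of Babbage's actual mechanism) of how a table of polynomial values can be produced using nothing but repeated additions once the first few differences are known:

```python
def difference_table(poly, start, step, count):
    """Tabulate a polynomial using only additions (method of finite differences).

    poly  -- coefficients [a0, a1, a2, ...] of a0 + a1*x + a2*x^2 + ...
    start -- first x value
    step  -- spacing between consecutive x values
    count -- number of table entries to produce
    """
    degree = len(poly) - 1

    def p(x):
        return sum(c * x**i for i, c in enumerate(poly))

    # Seed the machine: the first value and its finite differences,
    # computed once from degree+1 initial evaluations.
    diffs = [p(start + i * step) for i in range(degree + 1)]
    for level in range(1, degree + 1):
        for i in range(degree, level - 1, -1):
            diffs[i] = diffs[i] - diffs[i - 1]

    # Every further entry needs only additions -- this is the repetitive
    # work the Difference Engine's columns of gears performed mechanically.
    table = []
    for _ in range(count):
        table.append(diffs[0])
        for level in range(degree):
            diffs[level] += diffs[level + 1]
    return table

# Example: tabulate x^2 + x + 41 at x = 0, 1, 2, ...
print(difference_table([41, 1, 1], start=0, step=1, count=6))
# -> [41, 43, 47, 53, 61, 71]
```

After the initial differences are seeded, each new table entry costs only a handful of additions, which is exactly the kind of repetitive work a column of gears can do.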

Although his device worked quite well, Babbage soon became bored with a machine that performed only one algorithm. He spent a great deal of time, most of his family fortune, and another £17,000 from the government developing the Analytical Engine. The Analytical Engine had four components: a storage unit (the memory), a computing unit, an input device (for reading punched cards), and an output device (a punch and a printer). The memory consisted of 1000 words of 50 decimal digits each, used to hold variables and results. The computing unit received operands from memory, performed addition, subtraction, multiplication, or division, and returned the result to memory. Like the Difference Engine, the device was entirely mechanical.

The advantage of the Analytical Engine was that it could perform different tasks. It read commands from punched cards and carried them out. Some commands told the machine to take two numbers from memory, transfer them to the computing unit, perform an operation on them (for example, addition), and send the result back to the storage unit. Other commands tested a number and sometimes performed a branch depending on whether it was positive or negative. If punched cards with a different program were inserted into the reader, the machine performed a different set of operations. That is, unlike the Difference Engine, the Analytical Engine could execute several algorithms.

Since the Analytical Engine was programmed in a rudimentary assembly language, it needed software. To create this software, Babbage hired a young woman, Ada Augusta Lovelace, daughter of the famous British poet Byron. Ada Lovelace was thus the world's first computer programmer. The modern programming language Ada is named in her honor.

Unfortunately, like many modern engineers, Babbage never got his computer fully working. He needed thousands upon thousands of gears made with a precision unavailable in the 19th century. But Babbage's ideas were ahead of his era, and even today most modern computers are similar in design to the Analytical Engine. Therefore, it is fair to say that Babbage was the grandfather of the modern digital computer.

In the late 1930s, the German Konrad Zuse designed several automatic calculating machines using electromagnetic relays. He failed to obtain government funding for his developments because the war had begun. Zuse knew nothing of Babbage's work, and his machines were destroyed during the bombing of Berlin in 1944, so his work had no influence on the future development of computer technology. Nevertheless, he was one of the pioneers of the field.

A little later, calculating machines were designed in America. John Atanasoff's machine was extremely advanced for its time. It used binary arithmetic and storage elements that were periodically refreshed to prevent data loss; modern dynamic memory (RAM) works on exactly the same principle. Unfortunately, the machine never became fully operational. In a sense, Atanasoff was like Babbage: a dreamer let down by the technology of his time.

George Stibitz's computer actually worked, although it was more primitive than Atanasoff's machine. Stibitz demonstrated it at a conference at Dartmouth College in 1940. Attending this conference was John Mauchley, a then unremarkable professor of physics at the University of Pennsylvania, who later became very famous in the field of computer development.

While Zuse, Stibitz, and Atanasoff were developing automatic calculating machines, young Howard Aiken at Harvard was grinding through tedious manual calculations as part of his doctoral dissertation. After completing it, Aiken realized the importance of automatic computation. He went to the library, read about Babbage's work, and decided to build from relays the computer that Babbage had failed to build from gears.

Aiken's first computer, the Mark I, was completed in 1944. It had 72 words of 23 decimal digits each and could execute any instruction in 6 seconds. Punched paper tape was used for input and output. By the time Aiken completed work on the Mark II, relay computers were already obsolete. The era of electronics had begun.

First generation - vacuum tubes (1945-1955)

The stimulus for creating the electronic computer was the Second World War. At the beginning of the war, German submarines were destroying British ships. The German admirals sent commands to the submarines by radio, and although the British could intercept these commands, the problem was that the messages were encoded using a device called ENIGMA, whose forerunner had been designed by the amateur inventor and former US President Thomas Jefferson.

At the beginning of the war, the British managed to acquire ENIGMA from the Poles, who, in turn, stole it from the Germans. However, in order to decipher the encrypted message, a huge amount of calculations was required, and they had to be carried out immediately after intercepting the radiogram. Therefore, the British government founded a secret laboratory to create an electronic computer called COLOSSUS. The famous British mathematician Alan Turing took part in the creation of this machine. COLOSSUS was already working in 1943, but since the British government had complete control over the project and treated it as a military secret for 30 years, COLOSSUS did not become the basis for further computer development. We mention it only because it was the world's first electronic digital computer.

The Second World War influenced the development of computer technology in the United States. The Army needed tables that were used in targeting heavy artillery. Hundreds of women were hired to do calculations on manual adding machines and fill in the fields of these tables (it was believed that women were more accurate in calculations than men). However, this process was time consuming and errors were common.

John Mauchley, who knew the work of Atanasoff and Stibitz, realized that the army was interested in calculating machines. He urged the army to finance work on the creation of an electronic computer. The request was granted in 1943, and Mauchley and his student J. Presper Eckert began building an electronic computer, which they called ENIAC (Electronic Numerical Integrator and Computer). ENIAC consisted of 18,000 vacuum tubes and 1,500 relays, weighed 30 tons, and consumed 140 kilowatts of electricity. The machine had 20 registers, each of which could hold a 10-digit decimal number. (A decimal register is a very small memory that can hold a number up to a certain maximum number of digits, rather like the odometer that records how far a car has traveled.) ENIAC had about 6,000 multi-position switches and a multitude of cables running to connectors.

Work on the machine was completed in 1946, when it was no longer needed - at least to achieve the original goals.

Since the war was over, Mauchley and Eckert were allowed to set up a school where they shared their work with fellow scientists. It was at this school that interest in creating large digital computers arose.

After the school, other researchers took up the construction of electronic computers. The first working stored-program computer was the EDSAC (1949), designed by Maurice Wilkes at the University of Cambridge. It was followed by the JOHNIAC at the Rand Corporation, the ILLIAC at the University of Illinois, the MANIAC at the Los Alamos Laboratory, and the WEIZAC at the Weizmann Institute in Israel.

Eckert and Mauchley soon began work on the EDVAC machine (Electronic Discrete Variable Automatic Computer). Unfortunately, the project collapsed when they left the university to found a computer corporation in Philadelphia (Silicon Valley did not yet exist). After a series of mergers, that company became the Unisys Corporation.

Eckert and Mauchley wanted to obtain a patent on the invention of the digital computer. After several years of litigation it was ruled that the patent was invalid, since Atanasoff had invented the digital computer, even though he had never patented it.

While Eckert and Mauchley were working on the EDVAC machine, one of the ENIAC project participants, John von Neumann, went to the Institute for Advanced Study in Princeton to build his own version of the EDVAC, known as the IAS machine. Von Neumann was a genius of the same order as Leonardo da Vinci. He knew many languages, was an expert in physics and mathematics, and had a phenomenal memory: he remembered everything he had ever heard, seen, or read and could quote verbatim from memory the text of books he had read years before. By the time he became interested in computers, he was already the most famous mathematician in the world.

Von Neumann soon realized that building computers out of vast numbers of switches and cables was slow and tedious work. He came to the idea that the program should be stored in the computer's memory in digital form, together with the data. He also noted that the decimal arithmetic used in ENIAC, where each digit was represented by ten vacuum tubes (1 on and 9 off), should be replaced by parallel binary arithmetic. (Atanasoff had come to a similar conclusion several years earlier.)

The basic design that von Neumann first described is now known as the von Neumann machine. It was used in the EDSAC, the first stored-program computer, and even now, more than half a century later, it forms the basis of most modern digital computers. The design itself and the IAS machine had a very large influence on the further development of computer technology, so it is worth describing von Neumann's design briefly. It should be kept in mind that although the design is associated with von Neumann's name, other scientists took an active part in its development, in particular Goldstine. The architecture of this machine is described below.

The von Neumann machine consisted of five main parts: the memory, the arithmetic-logic unit, the control unit, and the input and output devices. The memory consisted of 4096 words of 40 bits each, a bit being 0 or 1. Each word held either two 20-bit instructions or a 40-bit signed integer. Within each instruction, 8 bits specified the instruction type and the remaining 12 bits identified one of the 4096 memory words. Together, the arithmetic-logic unit and the control unit formed the "brain center" of the computer; in modern machines they are combined on a single chip called the central processing unit (CPU).
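As an illustration only (the exact bit layout inside an instruction is an assumption made for this sketch, not taken from the original IAS documentation), a 40-bit word of this kind could be unpacked into two opcode/address pairs as follows:

```python
def decode_ias_word(word):
    """Split a 40-bit IAS-style word into two 20-bit instructions.

    Each 20-bit instruction is assumed to hold an 8-bit opcode followed
    by a 12-bit address (0..4095) -- a simplified, illustrative layout.
    """
    assert 0 <= word < 2**40, "word must fit in 40 bits"
    left = (word >> 20) & 0xFFFFF    # upper 20 bits: first instruction
    right = word & 0xFFFFF           # lower 20 bits: second instruction

    def split(instr):
        opcode = (instr >> 12) & 0xFF   # 8-bit instruction type
        address = instr & 0xFFF         # 12 bits select one of 4096 words
        return opcode, address

    return split(left), split(right)

# Example: pack opcode 0x01 with address 100 and opcode 0x05 with address 101
word = (0x01 << 32) | (100 << 20) | (0x05 << 12) | 101
print(decode_ias_word(word))   # ((1, 100), (5, 101))
```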

Inside the arithmetic-logic unit there was a special internal 40-bit register, the so-called accumulator. A typical instruction would add a word from memory to the accumulator or store the contents of the accumulator in memory. The machine did not have floating-point arithmetic, since von Neumann believed that any competent mathematician could keep track of the decimal point in his head.
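A minimal sketch of that accumulator style of execution (the opcodes here are invented for illustration and are not the real IAS instruction set):

```python
# Illustrative accumulator machine: opcode values are made up for this sketch.
LOAD, ADD, STORE, HALT = 1, 2, 3, 0

def run(program, memory):
    """Execute (opcode, address) pairs against a single accumulator."""
    acc = 0
    for opcode, addr in program:
        if opcode == LOAD:       # accumulator <- memory[addr]
            acc = memory[addr]
        elif opcode == ADD:      # accumulator <- accumulator + memory[addr]
            acc += memory[addr]
        elif opcode == STORE:    # memory[addr] <- accumulator
            memory[addr] = acc
        elif opcode == HALT:
            break
    return memory

mem = {0: 7, 1: 35, 2: 0}
run([(LOAD, 0), (ADD, 1), (STORE, 2), (HALT, 0)], mem)
print(mem[2])   # 42
```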

Around the same time that Von Neumann was working on the IAS machine, MIT researchers were developing their Whirlwind I computer. Unlike the IAS, ENIAC, and other machines of the same type with long words, the Whirlwind I machine had 16-bit words and was intended for work in real time. This project led to Jay Forrester's invention of magnetic core memory and then the first mass-produced minicomputer.

At that time, IBM was a small company that produced punch cards and mechanical machines for sorting punched cards. Although IBM partially financed Aiken's project, it had no interest in computers and did not build the 701 computer until 1953, many years after Eckert and Mauchley's UNIVAC had become number one in the computer market.

The 701 had 2048 words of 36 bits, each word holding two instructions. It was the first of a line of scientific machines that came to dominate the industry within a decade. Three years later came the 704, which had 4096 words of magnetic core memory, 36-bit instructions, and floating-point hardware. In 1958, IBM began producing its last vacuum-tube computer, the 709, which was essentially a refined version of the 704.

Second generation - transistors (1955-1965)

The transistor was invented at Bell Laboratories by John Bardeen, Walter Brattain, and William Shockley, for which they received the Nobel Prize in Physics in 1956. Within ten years, transistors had revolutionized computer manufacturing, and by the end of the 1950s vacuum-tube computers were hopelessly outdated. The first computer using transistors was built at MIT (the Massachusetts Institute of Technology). It used 16-bit words, like the Whirlwind I. It was called the TX-0 (Transistorized eXperimental computer 0) and was intended only as a testbed for the future TX-2 machine.

The TX-2 never amounted to much, but one of the laboratory's engineers, Kenneth Olsen, founded DEC (Digital Equipment Corporation) in 1957 to produce a commercial machine similar to the TX-0. This machine, the PDP-1, did not appear until four years later, mainly because DEC's financial backers considered computer manufacturing unprofitable; at first, therefore, DEC mostly sold small circuit boards.

The PDP-1 appeared only in 1961. It had 4096 18-bit words and a speed of 200,000 instructions per second. This was half the performance of the 7090, the transistorized successor of the 709 and the fastest computer in the world at the time, yet the PDP-1 cost $120,000 while the 7090 cost millions. DEC sold dozens of PDP-1s, and the minicomputer industry was born.

One of the first PDP-1s was given to MIT, where it quickly attracted the attention of some promising young researchers. One of the PDP-1's innovations was a 512 x 512 pixel display on which dots could be drawn. Soon MIT students had written a special program for the PDP-1 so they could play Spacewar, the world's first computer game.

A few years later, DEC developed the PDP-8, a 12-bit computer that cost much less than the PDP-1 (about $16,000). Its main innovation was a single bus, the omnibus. A bus is a set of parallel wires used to connect the components of a computer. This innovation radically distinguished the PDP-8 from the IAS machine, and the structure has been used in all small computers since. DEC sold 50,000 PDP-8s and became the leader in the minicomputer market.

As noted, with the invention of the transistor IBM built the 7090, a transistorized version of the 709, and later the 7094. The 7094 had a cycle time of 2 microseconds and a memory of 32,768 words of 36 bits. The 7090 and 7094 marked the end of the ENIAC-type machines, but they were widely used for scientific calculations throughout the 1960s.

IBM also produced the 1401, a computer for commercial computing. It could read and write magnetic tapes and punched cards and print output almost as fast as the 7094, but at a fraction of the price. It was poorly suited to scientific calculations, but it was very convenient for keeping business records.

The 1401 had no registers and no fixed word length. Its memory held 4,000 eight-bit bytes (in later models this grew to a then unimaginable 16,000 bytes). Each byte contained a 6-bit character, an administrative bit, and a bit used to mark the end of a word. The MOVE instruction, for example, has a source address and a destination address; it moves bytes from the source to the destination until it has moved a byte with the end-of-word bit set to 1.
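A rough sketch of that word-mark-terminated MOVE behaviour (the memory layout and the position of the marker bit are assumptions for illustration, not the actual 1401 encoding):

```python
WORD_MARK = 0x80   # assumed bit position marking the end of a word

def move(memory, src, dst):
    """Copy bytes from src to dst until a byte carrying the word mark is copied."""
    while True:
        byte = memory[src]
        memory[dst] = byte
        if byte & WORD_MARK:       # end-of-word bit reached: stop copying
            break
        src += 1
        dst += 1
    return memory

# 'CAT' stored at address 0, with the word mark set on the last character.
mem = bytearray(16)
mem[0:3] = b'CA' + bytes([ord('T') | WORD_MARK])
move(mem, src=0, dst=8)
print(bytes(mem[8:11]))   # b'CA\xd4' -- 'T' with the mark bit still set
```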

In 1964, CDC (Control Data Corporation) released the 6600, a machine almost an order of magnitude faster than the 7094. This computer for complex calculations was very popular, and CDC took off. The secret of its high performance was that the CPU (central processing unit) was internally a highly parallel machine: it had several functional units for addition, multiplication, and division, and they could all operate simultaneously. With a carefully written program and some effort, the machine could be made to execute up to 10 instructions at once.

The 6600 also had several small computers built into it. The central processor could thus devote itself entirely to number crunching, while the remaining functions (managing the machine and handling input and output) were performed by the small computers. Some of the 6600's operating principles are still used in modern computers.

The developer of the 6600, Seymour Cray, was a legendary figure, like von Neumann. He devoted his entire life to building very powerful computers, now called supercomputers, among them the 6600, the 7600, and the Cray-1. Seymour Cray is also the author of the famous "car-buying algorithm": you go to the dealership closest to your house, point to the car closest to the door, and say, "I'll take that one." This algorithm lets you spend a minimum of time on less important matters (buying cars) and most of your time on important ones (designing supercomputers).

Another computer worth mentioning is the Burroughs B5000. The developers of the PDP-1, 7094, and 6600 focused only on the hardware, trying to make it cheaper (DEC) or faster (IBM and CDC), while the software stayed much the same. The B5000 designers took a different route: they built the machine with the intention of programming it in Algol 60 (a predecessor of C and Java) and constructed the hardware so as to simplify the compiler's task. Thus arose the idea that software should also be taken into account when designing a computer. But the idea was soon forgotten.

Third generation - integrated circuits (1965-1980)

The invention of the silicon integrated circuit in 1958 by Robert Noyce meant that dozens of transistors could be placed on one small chip. Computers built from integrated circuits were smaller, faster, and cheaper than their transistorized predecessors.

By 1964, IBM was the leader in the computer market, but there was one big problem: the 7094 and 1401 computers it produced were incompatible with each other. One was intended for complex calculations and used binary arithmetic on 36-bit registers; the other used decimal arithmetic and variable-length words. Many customers had both computers and did not like the fact that they were completely incompatible.

When it came time to replace these two series, IBM took the plunge. It released the System/360, a line of machines based on integrated circuits that was intended for both scientific and commercial calculations. The System/360 line contained many innovations. It was a whole family of computers sharing one language (assembly language), with each successive model more capable than the previous one. The company could replace the 1401 with the 360 Model 30 and the 7094 with the 360 Model 75. The Model 75 was larger, faster, and more expensive, but programs written for one model could be run on another. In practice, programs written for a small model ran on a large model without much difficulty; when software was moved from a large machine to a small one, however, there might not be enough memory. Still, the creation of such a line of computers was a great achievement, and the idea of computer families soon became very popular: within a few years most computer companies were selling series of similar machines with different prices and features. The table below shows some parameters of the first models of the 360 family; other models of this family are discussed later.

The first models of the IBM 360 series:

Parameter Model 30 Model 40 Model 50 Model 65
Relative performance 1 3.5 10 21
Cycle time (ns) 1000 625 500 250
Maximum memory capacity (bytes) 65536 262144 262144 524288
Bytes fetched from memory per cycle 1 2 4 16
Maximum number of data channels 3 3 4 6

Another innovation of the 360 was multiprogramming. Several programs could reside in the computer's memory at the same time, so while one program was waiting for I/O to complete, another could run. As a result, processor resources were used more efficiently.

The 360 was also the first machine that could fully emulate the operation of other computers. The small models could emulate the 1401, and the larger ones could emulate the 7094, so programmers could keep their old programs unchanged and run them on the 360. Some 360 models ran 1401 programs much faster than the 1401 itself, so rewriting programs became pointless.

The 360 series could emulate other computers because it was built using microprogramming. It was only necessary to write three microprograms: one for the 360 instruction set, one for the 1401 instruction set, and one for the 7094 instruction set. This need for flexibility was one of the main reasons for using microprogramming.

The 360 also managed to resolve the dilemma between binary and decimal arithmetic: it had sixteen 32-bit registers for binary arithmetic, but its memory was byte-oriented, like that of the 1401, and it had 1401-style instructions for moving variable-sized records from one part of memory to another.

The address space of the 360 was 2^24 bytes (16 MB). At that time, this amount of memory seemed enormous. The 360 line was later succeeded by the 370 line, then the 4300, 3080, and 3090, all with a similar architecture. By the mid-1980s, 16 MB was no longer enough, and IBM had to give up some compatibility in order to move to the 32-bit addressing needed for a 2^32-byte address space.
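As a quick check of these figures, the size of an address space is simply 2 raised to the address width:

```python
# Address width determines how much memory can be addressed directly.
for bits in (24, 32):
    size = 2 ** bits
    print(f"{bits}-bit addresses -> {size:,} bytes = {size // 2**20:,} MB")
# 24-bit addresses -> 16,777,216 bytes = 16 MB
# 32-bit addresses -> 4,294,967,296 bytes = 4,096 MB
```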

One might think that since the machines had 32-bit words and registers, they might as well have 32-bit addresses. But at that time, no one could even imagine a computer with a memory capacity of 16 MB. Blaming IBM for lack of foresight is the same as blaming modern personal computer manufacturers for having only 32-bit addresses. Perhaps in a few years the memory capacity of computers will be much larger than 4 GB, and then 32-bit addresses will not be enough.

The world of minicomputers took a big step forward in the third generation with the PDP-11 line, the 16-bit successors to the PDP-8. In many ways, the PDP-11 was the little brother of the 360, just as the PDP-1 was the little brother of the 7094. Both the 360 and the PDP-11 had word-oriented registers and byte-oriented memory, and both lines offered models spanning a wide range of prices and features. The PDP-11 was widely used, especially in universities, and kept DEC in the lead among minicomputer manufacturers.

Fourth generation - ultra-large-scale integrated circuits (1980-?)

The appearance of very large scale integration (VLSI) in the 1980s made it possible to place first tens of thousands, then hundreds of thousands, and finally millions of transistors on a single chip. This led to smaller and faster computers. Before the PDP-1, computers were so large and expensive that companies and universities had to have special departments (computing centers) to run them. By the 1980s, prices had fallen so much that not only organizations but also individuals could afford a computer. The era of personal computers had begun.

Personal computers were used for completely different purposes than their predecessors: word processing, spreadsheets, and highly interactive applications (such as games) that large computers could not handle well.

The first personal computers were sold as kits. Each kit contained a printed circuit board, a set of integrated circuits typically including an Intel 8080, some cables, a power supply, and sometimes an 8-inch floppy drive. The buyer had to assemble the computer from these parts, and no software was included: the buyer had to write it himself. Later, the CP/M operating system appeared, written by Gary Kildall for the Intel 8080. It was supplied on a floppy disk and included a file management system and an interpreter for executing user commands typed at the keyboard.

Another personal computer, the Apple (and later the Apple II), was developed by Steve Jobs and Steve Wozniak. This computer became extremely popular among home users and schools, making Apple a serious player in the market.

Observing what other companies were doing, IBM, then the leader in the computer market, also decided to start producing personal computers. But instead of designing a machine from scratch using only IBM components, which would have taken far too long, IBM gave one of its employees, Philip Estridge, a large sum of money, ordered him to go somewhere far from the meddling bureaucrats at the company's headquarters in Armonk, New York, and told him not to return until he had a working personal computer. Estridge set up shop far from headquarters, in Florida, took the Intel 8088 as the central processor, and built a personal computer from off-the-shelf components. This computer, the IBM PC, appeared in 1981 and became the best-selling computer in history.

However, IBM did one thing that it later regretted. Instead of keeping the machine's design secret (or at least shielding itself with patents) as it usually did, the company published the complete designs, including all the circuitry, in a $49 book. The book was published so that other companies could produce add-on boards for the IBM PC, thereby increasing the machine's compatibility and popularity. Unfortunately for IBM, once the design became widely known, many companies began making PC clones and often sold them much more cheaply than IBM (since all of the computer's components could easily be purchased). Thus began the explosive growth of personal computer production.

Although some companies (such as Commodore, Apple, and Atari) produced personal computers using their own processors rather than Intel's, the momentum of the IBM PC was so great that the others struggled. Only a few of them survived, and only because they specialized in narrow niches such as workstations or supercomputers.

The first version of the IBM PC came with the MS-DOS operating system, supplied by the then tiny Microsoft Corporation. IBM and Microsoft later jointly developed MS-DOS's successor, OS/2, whose characteristic feature was a graphical user interface (GUI) similar to that of the Apple Macintosh. Meanwhile, Microsoft also developed its own Windows operating system, which ran on top of MS-DOS, in case OS/2 failed to catch on. OS/2 indeed failed to catch on, and Microsoft went on to release Windows with great success, which caused an enormous rift between IBM and Microsoft. The legend of how tiny Intel and the even tinier Microsoft managed to topple IBM, one of the largest, richest, and most powerful corporations in world history, is told in detail in business schools around the world.

The initial success of the 8088 encouraged Intel to keep improving it. Particularly noteworthy is the 386, released in 1985, the first of what became the Pentium line. Modern Pentium processors are much faster than the 386, but from an architectural standpoint they are simply more powerful versions of it.

In the mid-1980s, CISC (Complex Instruction Set Computer) designs began to give way to RISC (Reduced Instruction Set Computer) designs. RISC instructions were simpler and much faster. In the 1990s, superscalar processors appeared that could execute many instructions simultaneously, often not in the order in which they appear in the program.

Up until 1992, personal computers were 8-, 16-, and 32-bit. Then came DEC's revolutionary 64-bit Alpha, a thoroughbred RISC machine that far outperformed all other PCs. At the time, however, its commercial success was very modest: only a decade later did 64-bit machines gain popularity, and then only as professional servers.

Fifth generation - invisible computers

In 1981, the Japanese government announced its intention to allocate $500 million to national companies for the development of fifth-generation computers based on artificial intelligence technologies, which were supposed to supplant the "dim-witted" fourth-generation machines. Watching Japanese companies rapidly capture market positions in the most varied industries - from cameras to stereos and televisions - American and European manufacturers rushed in a panic to demand similar subsidies and other support from their governments. However, despite all the noise, the Japanese fifth-generation project ultimately proved unworkable and was quietly shelved. In a sense, the situation was close to the one Babbage had faced: the idea was so far ahead of its time that there was no adequate technological basis for implementing it.

Nevertheless, what can be called the fifth generation of computers did materialize, but in a very unexpected way: computers began to shrink rapidly. The Apple Newton, which appeared in 1993, clearly proved that a computer could fit into a case the size of a cassette player. The Newton's handwriting input seemed to complicate matters at first, but the user interfaces of such machines, now called personal digital assistants (PDAs) or simply pocket computers, were later improved and gained wide popularity. Many pocket computers today are no less powerful than ordinary PCs of two or three years ago.

But even pocket computers did not become a truly revolutionary development. Much greater importance is attached to the so-called "invisible" computers: those built into household appliances, watches, bank cards, and a huge number of other devices. Processors of this type provide extensive functionality and an equally wide range of applications at a very reasonable price. Whether these chips can be grouped into a genuine generation is debatable (they have been around since the 1970s), but the fact remains that they expand the capabilities of household and other devices by an order of magnitude. The influence of invisible computers on world industry is already very large, and it will grow over the years. One of the distinctive features of this type of computer is that its hardware and software are often designed together (co-design).

Conclusion

So, the first generation comprises computers based on vacuum tubes (such as ENIAC); the second, transistor machines (the IBM 7094); the third, the first computers built on integrated circuits (the IBM 360); and the fourth, personal computers (lines built around Intel CPUs). As for the fifth generation, it is no longer associated with a specific architecture but with a paradigm shift. The computers of the future will be built into every imaginable and unimaginable device and will thereby truly become invisible. They will become firmly integrated into everyday life: they will open doors, turn on lamps, dispense money, and perform thousands of other duties. This model, developed by Mark Weiser late in his career, was originally called ubiquitous computing, although the term "pervasive computing" is now also widespread. This phenomenon promises to change the world no less radically than the industrial revolution.

Based on materials from the book "Computer Architecture" by A. Tanenbaum, 5th edition.

The computer is one of the greatest inventions of its time. Billions of people around the world use computers in their daily lives.

Over the decades, the computer has evolved from a very expensive and slow device to today's extremely smart machines with incredible processing power.

No single person is credited with inventing the computer; many believe that Konrad Zuse and his Z1 machine were the first in a long line of innovations that brought us the computer. Konrad Zuse was a German who gained fame for creating the first freely programmable mechanical computing device in 1936. Zuse's Z1 was created with an emphasis on 3 main elements that are still used in modern calculators. Later, Konrad Zuse created the Z2 and Z3.

The first computers of the Mark series were built at Harvard. The Mark I was created in 1944; it was the size of a room, measuring 55 feet long and 8 feet high. The Mark I could perform a wide range of calculations. It proved a successful invention and was used by the US Navy until 1959.

The ENIAC computer was one of the most important advances in computing. It was commissioned during World War II by the American military. This computer used vacuum tubes instead of electric motors and levers to perform fast calculations, and it was thousands of times faster than any other computing device of the time. The machine was huge, with a total cost of about $500,000. ENIAC remained in service until 1955.

RAM, or Random Access Memory, was introduced in 1964. The earliest random-access memories detected differences in electrical charge on a metal plate placed next to a vacuum tube. It was a simple way to store computer instructions.

There were many innovations in the 1940s. A machine developed at Manchester, with the involvement of the Telecommunications Research Establishment, was the first computer to run a stored program; it became operational in 1948. The Manchester Mark I carried this work forward, and by 1951 it showed enormous progress.

UNIVAC was built by the creators of ENIAC. It was a fast and innovative computer capable of handling many kinds of calculations, a masterpiece of its time that was highly praised by the public.

IBM produced the first personal computer that was widely used and available to ordinary people. The IBM 701 was the first general-purpose computer developed by IBM. A new computer language called Fortran was used with the newer 704 model. The IBM 7090 was also a great success, and IBM dominated office computing for the next 20 years. In the late 1970s and in 1980, IBM developed the personal computer that became known as the PC. IBM has had a huge influence on the computers used today.

With the growth of the personal computer market in the early and mid-1980s, many companies realized that graphical interfaces were more user-friendly. This led Microsoft to develop the operating system named Windows. The first version was called Windows 1.0, and later came Windows 2.0 and 3.0. Microsoft continues to grow in popularity today.

Today, computers are extremely powerful and more affordable than ever. They have practically infiltrated every aspect of our lives. They are used as a powerful communication and trading tool. The future of computers is huge.