The history of the creation of computers of different generations

One of the first devices (5th-4th centuries BC), from which the history of the development of computers can be considered to have begun, was a special board, later called the “abacus”. Calculations on it were carried out by moving pebbles or beads in the grooves of boards made of bronze, stone, ivory and the like. In Greece the abacus existed already in the 5th century BC; the Japanese called their version the “soroban”, the Chinese the “suanpan”. In Ancient Rus', a similar device, the “counting board”, was used for calculation. In the 17th century, this device took the form of the familiar Russian abacus.

The abacus (5th-4th centuries BC)

The French mathematician and philosopher Blaise Pascal created his calculating machine in 1642; it was named the Pascaline in honor of its creator. A mechanical device in the form of a box with many gears, it performed subtraction as well as addition. Data was entered by turning dials that corresponded to the digits 0 to 9, and the answer appeared at the top of the metal case.


The Pascaline

In 1673, Gottfried Wilhelm Leibniz created a mechanical calculating device (the Leibniz calculator, or stepped reckoner) which for the first time could not only add and subtract but also multiply, divide and extract square roots. The Leibniz stepped wheel later became the prototype for mass-produced calculating instruments - adding machines.


A model of the Leibniz stepped calculator

The English mathematician Charles Babbage developed a device that not only performed arithmetic operations but also printed the results immediately. In 1832, a scaled-down model was built from two thousand brass parts; it weighed three tons but was capable of performing arithmetic accurate to the sixth decimal place and tabulating second-order differences. This machine, called the Difference Engine, became a prototype of real computers.

The Difference Engine

A summing apparatus with continuous transmission of tens was created by the Russian mathematician and mechanic Pafnuty Lvovich Chebyshev. The device automated all arithmetic operations, and in 1881 an attachment for multiplication and division was added to it. The principle of continuous transmission of tens has since been widely used in various counters and computers.


Chebyshev summing apparatus

Automated data processing appeared at the end of the 19th century in the USA, when Herman Hollerith created the Hollerith tabulator, a device in which information punched into cards was read by means of electric current.

Hollerith tabulator

In 1936, a young Cambridge scientist, Alan Turing, conceived a calculating machine that existed only on paper. His “smart machine” operated according to a specified algorithm, and depending on the algorithm the imaginary machine could be applied to a wide variety of purposes. At the time these were purely theoretical considerations, but they served as the prototype of the programmable computer: a computing device that processes data according to a defined sequence of commands.
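
This idea is easy to illustrate in modern terms. Below is a minimal sketch in Python (purely illustrative; the rule table and names are invented for this example, not Turing's notation): a table of rules drives a head along a tape, and changing the table changes what the machine computes. This example machine inverts a string of bits and halts at the first blank cell.

    # A minimal Turing machine: (state, symbol) -> (symbol to write, head move, next state).
    rules = {
        ("scan", "0"): ("1", +1, "scan"),
        ("scan", "1"): ("0", +1, "scan"),
        ("scan", " "): (" ", 0, "halt"),   # blank cell: stop
    }

    def run(tape: str) -> str:
        cells = list(tape) + [" "]
        state, head = "scan", 0
        while state != "halt":
            write, move, state = rules[(state, cells[head])]
            cells[head] = write
            head += move
        return "".join(cells).strip()

    print(run("100101"))  # -> 011010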

Information revolutions in history

In the history of civilization, several information revolutions have occurred: transformations of social relations caused by changes in the processing, storage and transmission of information.

The first revolution is associated with the invention of writing, which led to a gigantic qualitative and quantitative leap in civilization: it became possible to pass knowledge from generation to generation.

The second revolution (mid-15th century) was caused by the invention of printing, which radically changed society, culture and the organization of activity.

The third revolution (end of the 19th century) is associated with discoveries in the field of electricity, thanks to which the telegraph, telephone and radio appeared - devices that made it possible to transmit and accumulate information quickly and in any volume.

The fourth revolution (from the 1970s) is associated with the invention of microprocessor technology and the advent of the personal computer. Computers and data transmission systems (information communications) came to be built on microprocessors and integrated circuits.

This period is characterized by three fundamental innovations:

  • transition from mechanical and electrical means of information conversion to electronic ones;
  • miniaturization of all components, devices, instruments, machines;
  • creation of software-controlled devices and processes.

History of the development of computer technology

The need to store, convert and transmit information arose in humans long before the creation of the telegraph, the first telephone exchange and the electronic computer. In fact, all the experience and knowledge accumulated by humanity contributed, one way or another, to the emergence of computer technology. The history of the creation of computers - the general name for electronic machines that perform calculations - begins far in the past and is connected with the development of almost all aspects of human life and activity. For as long as human civilization has existed, some automation of calculation has been in use.

The history of electronic computer technology spans several decades, during which several generations of computers have succeeded one another. Each generation was distinguished by a new element base (vacuum tubes, transistors, integrated circuits) whose manufacturing technology was fundamentally different. Currently, there is a generally accepted classification of computer generations:

  • First generation (1946 - early 1950s). Element base: vacuum tubes. Computers were distinguished by large dimensions, high energy consumption, low speed, low reliability, and programming in machine code.
  • Second generation (late 1950s - early 1960s). Element base: semiconductor devices (transistors). Almost all technical characteristics improved compared to first-generation computers. Algorithmic (high-level) languages came into use for programming.
  • Third generation (late 1960s - late 1970s). Element base: integrated circuits and multilayer printed circuit boards. A sharp reduction in the size of computers, an increase in their reliability and productivity. Access from remote terminals.
  • Fourth generation (mid-1970s to the end of the 1980s). Element base: microprocessors and large-scale integrated circuits. Technical characteristics improved further. Mass production of personal computers. Directions of development: powerful multiprocessor computing systems with high performance, and cheap microcomputers.
  • Fifth generation (from the mid-1980s). Development of intelligent computers began, so far without success. Computer networks were introduced into all fields and integrated; distributed data processing and computer information technologies came into widespread use.

Along with the change of computer generations, the nature of their use also changed. At first computers were created and used mainly to solve computational problems; later their scope expanded to include information processing, automation of the control of production, technological and scientific processes, and much more.

Konrad Zuse's principles of computer operation

The idea of building an automatic calculating apparatus occurred to the German engineer Konrad Zuse, and in 1934 he formulated the basic principles on which future computers should work:

  • binary number system;
  • use of devices operating on the “yes/no” principle (logical 1/0);
  • fully automated operation of the computer;
  • program control of the calculation process;
  • support for floating-point arithmetic;
  • use of large-capacity memory.

Zuse was the first in the world to recognize that data processing begins with the bit (he called the bit a “yes/no status” and the formulas of binary algebra “conditional propositions”), the first to introduce the term “machine word” (Word), and the first to combine arithmetic and logical operations in one unit, noting that “the elementary operation of a computer is testing two binary numbers for equality. The result is again a binary number with two values (equal, not equal).”
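
Zuse's “elementary operation” is easy to express in modern terms. A minimal sketch in Python (illustrative only; the function name and word width are assumptions of this example): two machine words are equal exactly when their bitwise XOR is zero.

    # Zuse's "elementary operation": testing two binary numbers for equality.
    # XOR yields 1 only where the bits differ, so a zero result means "equal".
    def words_equal(a: int, b: int, width: int = 8) -> bool:
        mask = (1 << width) - 1          # keep only `width` bits of each word
        return ((a ^ b) & mask) == 0

    assert words_equal(0b1010, 0b1010)       # equal
    assert not words_equal(0b1010, 0b1011)   # not equal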

First generation - computers with vacuum tubes

Colossus I, the first tube-based computer, was created by the British in 1943 to decipher German military ciphers; it contained 1,800 vacuum tubes and was one of the first programmable electronic digital computers.

ENIAC was created to calculate artillery ballistics tables. The machine weighed 30 tons, occupied about 300 m², and consumed up to 174 kW of electricity. It contained 17,468 vacuum tubes of sixteen types, 7,200 crystal diodes and 4,100 magnetic elements, housed in cabinets with a total volume of about 100 m³. ENIAC performed 5,000 operations per second; the total cost of the machine was $750,000.


ENIAC - a device for calculating artillery ballistics tables

Another representative of the first generation worth noting is EDVAC (Electronic Discrete Variable Automatic Computer). EDVAC is interesting because programs were recorded electronically in so-called “ultrasonic delay lines” using mercury tubes: 126 such lines could store 1,024 words of 44-bit binary numbers. This was the “fast” memory. As “slow” memory it was planned to record numbers and commands on magnetic wire, but this method proved unreliable, and it was necessary to return to teletype tape. EDVAC was faster than its predecessor, adding in 1 µs and dividing in 3 µs. It contained only 3.5 thousand vacuum tubes and occupied 13 m² of floor space.

UNIVAC (Universal Automatic Computer) was an electronic device with programs stored in memory, entered not from punched cards but from magnetic tape; this provided high-speed reading and writing of information and, consequently, higher overall performance. One tape could hold a million characters written in binary form; tapes could store both programs and intermediate data.


Representatives of the first generation of computers: 1) Electronic Discrete Variable Computer; 2) Universal Automatic Computer

Second generation - computers with transistors

Transistors replaced vacuum tubes in the early 60s. A transistor acts as an electrical switch; it consumes less power, generates less heat and takes up less space. Combining several transistor circuits on a single semiconductor crystal produces an integrated circuit (chip). Transistors work as binary elements: they record two states, the presence and the absence of current, and thereby process information presented in exactly this binary form.
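
How two-state switches add up to computation can be shown with a toy model. The sketch below (Python, purely illustrative; the gate and function names are this example's assumptions) treats each gate as a function on the values 0 and 1 and composes a half adder, the elementary building block of binary addition that integrated circuits replicate many thousands of times.

    # Each "switch" carries 0 (no current) or 1 (current); gates combine switches.
    def AND(a, b): return a & b
    def XOR(a, b): return a ^ b

    def half_adder(a, b):
        # XOR gives the sum bit, AND gives the carry bit.
        return XOR(a, b), AND(a, b)

    for a in (0, 1):
        for b in (0, 1):
            s, carry = half_adder(a, b)
            print(f"{a} + {b} -> carry={carry}, sum={s}")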

In 1951, William Shockley invented the p-n junction transistor. The transistor replaced the vacuum tube while operating at higher speed, producing very little heat and consuming almost no electricity. Simultaneously with the replacement of vacuum tubes by transistors, methods of storing information improved: magnetic cores and magnetic drums came into use as memory devices, and by the 60s the storage of information on disks had become widespread.

One of the first transistor computers, the Atlas Guidance Computer, was launched in 1957 and was used to control the launch of the Atlas rocket.

Created in 1956, the RAMAC was a low-cost computer with modular external disk memory and combined magnetic-core and drum random-access memory. Although not yet fully transistorized, it was distinguished by high performance and ease of maintenance and was in great demand in the office automation market, so a “large” RAMAC (IBM 305) was soon released for corporate customers; to hold 5 MB of data, the RAMAC system needed 50 disks 24 inches in diameter. The information system created on the basis of this model flawlessly processed arrays of queries in 10 languages.

In 1959, IBM created its first all-transistor mainframe, the 7090, capable of 229,000 operations per second - a true transistorized mainframe. In 1964, American Airlines put SABRE, the first automated ticket sales and reservation system, into operation in 65 cities, built on two 7090 mainframes.

In 1960, DEC introduced the world's first minicomputer, the PDP-1 (Programmed Data Processor), a computer with a monitor and keyboard that became one of the most notable phenomena on the market. It performed 100,000 operations per second, and the machine itself occupied only 1.5 m² of floor space. The PDP-1 also became, in effect, the world's first gaming platform thanks to MIT student Steve Russell, who wrote the computer game Spacewar! for it.


Representatives of the second generation of computers: 1) RAMAC; 2) PDP-1

In 1968, Digital launched serial production of minicomputers with the PDP-8: it cost about $10,000 and was the size of a refrigerator. This was a PDP-8 that laboratories, universities and small businesses could afford.

Domestic (Soviet) computers of that time can be characterized as follows: in architectural, circuit and functional solutions they corresponded to their era, but their capabilities were limited by the imperfection of the production and element base. The most popular machines were those of the BESM series. Serial production, quite modest in scale, began with the Ural-2 computer (1958), followed by the BESM-2, Minsk-1 and Ural-3 (all 1959). In 1960, the M-20 and Ural-4 went into production. At the end of 1960 the top performer was the M-20 (4,500 vacuum tubes, 35 thousand semiconductor diodes, a 4,096-word memory), at 20 thousand operations per second. The first computers based on semiconductor elements (“Razdan-2”, “Minsk-2”, “M-220” and “Dnepr”) were still in development.

Third generation - small-sized computers based on integrated circuits

In the 50s and 60s, assembling electronic equipment was a labor-intensive process, slowed by the increasing complexity of electronic circuits. For example, the CDC 1604 computer (1960, Control Data Corporation) contained about 100 thousand diodes and 25 thousand transistors.

In 1958-1959, the Americans Jack St. Clair Kilby (Texas Instruments) and Robert N. Noyce (Fairchild Semiconductor) independently invented the integrated circuit (IC): a collection of transistors and other components placed on a single silicon chip.

The production of computers using ICs (they were later called microcircuits) was much cheaper than using transistors. Thanks to this, many organizations were able to purchase and use such machines. And this, in turn, led to an increase in demand for general-purpose computers designed to solve various problems. During these years, computer production acquired an industrial scale.

At the same time, semiconductor memory appeared, which is still used in personal computers to this day.


Representative of the third generation of computers - ES-1022

Fourth generation - personal computers based on microprocessors

The forerunners of the IBM PC were the Apple II, Radio Shack TRS-80, Atari 400 and 800, Commodore 64 and Commodore PET.

The birth of the personal computer (PC) is rightfully associated with Intel processors. The corporation was founded in mid-June 1968; since then Intel has grown into the world's largest manufacturer of microprocessors, with more than 64 thousand employees. Intel's goal was to create semiconductor memory, and in order to survive the company also began taking third-party orders for the development of semiconductor devices.

In 1971, Intel received an order to develop a set of 12 chips for programmable microcalculators, but its engineers found the creation of 12 specialized chips cumbersome and inefficient. The problem of reducing the number of microcircuits was solved by creating a “pair” of semiconductor memory and an execution unit capable of operating on commands stored in that memory. It was a breakthrough in computing philosophy: a universal logic device in the form of the 4-bit central processing unit i4004, which was later called the first microprocessor. It was a set of 4 chips, including one chip controlled by commands stored in semiconductor internal memory.

As a commercial product, the microcomputer (as the chip was then called) appeared on the market on November 11, 1971 under the name 4004: 4-bit, containing 2,300 transistors, clocked at 60 kHz, and costing $200. In 1972, Intel released the eight-bit 8008 microprocessor, and in 1974 its improved version, the Intel 8080, which by the end of the 70s had become the standard for the microcomputer industry. As early as 1973, the first computer based on the 8008 processor, the Micral, appeared in France; for various reasons the 8008 was not successful in America. (The 8080, by contrast, was copied in the Soviet Union and produced for a long time under the name 580VM80.) At the same time, a group of engineers left Intel and founded Zilog. Its most famous product, the Z80, had an extended 8080 instruction set and, which ensured its commercial success in household appliances, made do with a single 5 V supply voltage. On its basis the ZX Spectrum computer was created (sometimes called the Sinclair, after its creator), which practically became the prototype of the home PC of the mid-80s. In 1978, Intel released the 16-bit 8086 processor, followed by the 8088, an analogue of the 8086 except for its external 8-bit data bus (all peripherals were still 8-bit at the time).

A competitor of the Intel-based machines, the Apple II was distinguished by the fact that it was not a completely closed device: some freedom was left for modification directly by the user - it was possible to install additional interface boards, memory boards, and so on. This feature, later called “open architecture”, became its main advantage. The success of the Apple II was also helped by two innovations of 1978-1979: inexpensive floppy disk storage and the first commercial spreadsheet program, VisiCalc.

The Altair-8800 computer, built on the Intel 8080 processor, was very popular in the 70s. Although its capabilities were quite limited (the RAM was only 4 KB, and a keyboard and screen were absent), its appearance was greeted with great enthusiasm. It was launched on the market in 1975, and several thousand kits were sold in the first months.


Representatives of the fourth generation of computers: a) Micral; b) Apple II

This computer, developed by MITS, was sold by mail as a kit of parts for self-assembly. The entire assembly kit cost $397, while the Intel processor alone sold for $360.

The spread of PCs by the end of the 70s led to a slight decline in demand for large computers and minicomputers, and in 1979 IBM decided to enter the market with its own PC based on the 8088 processor. The software of the early 80s was oriented toward word processing and simple spreadsheets, and the very idea that a “microcomputer” could become a familiar and necessary device at work and at home seemed incredible.

On August 12, 1981, IBM introduced the Personal Computer (PC), which, in combination with software from Microsoft, became the standard for the entire PC fleet of the modern world. The IBM PC model with a monochrome display cost about $3,000; with a color display, $6,000. Its configuration: an Intel 8088 processor at 4.77 MHz with 29 thousand transistors, 64 KB of RAM, one 160 KB floppy drive, and an ordinary built-in speaker. Launching and working with applications at that time was a real pain: because there was no hard drive you had to constantly swap floppy disks; there was no mouse, no graphical windowed user interface, no exact correspondence between the image on the screen and the final result (WYSIWYG). Color graphics were extremely primitive, and three-dimensional animation and photo processing were out of the question, but the history of personal computers began with this model.

In 1984, IBM introduced two more new products. First, a model for home users was released, the PCjr, based on the 8088 processor; it was equipped with perhaps the first wireless keyboard, but this model did not achieve success in the market.

The second new product was the IBM PC AT. Its most important feature was the transition to higher-level microprocessors (the 80286 with an 80287 numeric coprocessor) while maintaining compatibility with previous models. This computer set the standard for many years to come in a number of respects: it was the first to introduce the 16-bit expansion bus (which long remained standard) and EGA graphics adapters with a resolution of 640x350 and 16 colors.

In 1984, the first Macintosh computers were released, with a graphical interface, a mouse and many other user interface attributes now essential to desktop computers. The new interface did not leave users indifferent, but the revolutionary computer was not compatible with previous programs or hardware components. And in the corporations of that time, WordPerfect and Lotus 1-2-3 had already become normal working tools; users had grown accustomed to the DOS character interface. From their point of view, the Macintosh even looked somehow frivolous.

Fifth generation of computers (from 1985 to the present time)

Distinctive features of the V generation:

  1. New production technologies.
  2. Abandonment of traditional programming languages such as Cobol and Fortran in favor of languages with increased capabilities for symbol manipulation and elements of logic programming (Prolog and Lisp).
  3. Emphasis on new architectures (e.g. dataflow architecture).
  4. New user-friendly input/output methods (e.g. speech and image recognition, speech synthesis, natural-language message processing).
  5. Artificial intelligence (that is, automation of problem solving, inference and knowledge manipulation).

It was at the turn of the 80s and 90s that the Windows-Intel alliance was formed. When Intel released the 486 microprocessor in early 1989, computer makers did not wait for IBM or Compaq to lead the way. A race began that dozens of companies entered, but all the new computers were extremely similar to each other: they were united by compatibility with Windows and processors from Intel.

In 1989, the i486 processor was released. It had a built-in math coprocessor, pipeline, and built-in L1 cache.

Directions of computer development

Neurocomputers can be classified as a sixth generation of computers. Although real use of neural networks began relatively recently, neurocomputing as a scientific field is now in its seventh decade: the first neurocomputer was built in 1958. Its developer was Frank Rosenblatt, who gave his brainchild the name Mark I.

The theory of neural networks was first outlined in the work of McCulloch and Pitts in 1943: any arithmetic or logical function can be implemented using a simple neural network. Interest in neurocomputing reignited in the early 1980s and was fueled by new work with multilayer perceptrons and parallel computing.

Neurocomputers are computers consisting of many simple computing elements (neurons) working in parallel. The neurons form so-called neural networks, and the high performance of neurocomputers is achieved precisely through their huge number. Neurocomputers are built on a biological principle: the human nervous system consists of individual cells, neurons, of which the brain contains about 10¹², even though the response time of a neuron is about 3 ms. Each neuron performs fairly simple functions, but since each is connected on average to 1-10 thousand other neurons, together they successfully ensure the functioning of the human brain.
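
The McCulloch-Pitts result mentioned above can be illustrated in a few lines of Python (a sketch only; the weights and thresholds are chosen for this example and say nothing about how real neurocomputers are engineered): a single threshold neuron already computes logical functions such as AND and OR.

    # A McCulloch-Pitts neuron: output 1 if the weighted sum of binary
    # inputs reaches the threshold, otherwise output 0.
    def neuron(inputs, weights, threshold):
        total = sum(x * w for x, w in zip(inputs, weights))
        return 1 if total >= threshold else 0

    def AND(x1, x2): return neuron([x1, x2], [1, 1], threshold=2)
    def OR(x1, x2):  return neuron([x1, x2], [1, 1], threshold=1)

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, "AND:", AND(x1, x2), "OR:", OR(x1, x2))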

A representative of the sixth generation of computers - the Mark I

In optoelectronic computers, the information carrier is a light flux. Electrical signals are converted into optical signals and back. Optical radiation as an information carrier has a number of potential advantages over electrical signals:

  • Light fluxes, unlike electrical ones, can intersect one another;
  • Light fluxes can be localized in the transverse direction down to nanometer dimensions and transmitted through free space;
  • The interaction of light fluxes with nonlinear media is distributed throughout the medium, which gives new degrees of freedom in organizing communication and creating parallel architectures.

Currently, development is under way on computers consisting entirely of optical information-processing devices. Today this is the most intriguing direction.

An optical computer would have unprecedented performance and a completely different architecture from an electronic one: in one clock cycle lasting less than 1 nanosecond (corresponding to a clock frequency of more than 1 GHz), an optical computer could process a data array of about 1 megabyte or more. To date, individual components of optical computers have been created and optimized.

An optical computer the size of a laptop could give the user the ability to store almost all the world's recorded information while solving problems of any complexity.

Biological computers are computing devices based on DNA computing. There are so few truly demonstrative works in this area that it is too early to speak of significant results.

Molecular computers are computers whose operating principle is based on changes in the properties of molecules, as in photosynthesis. In the course of its photocycle a molecule takes on different states, so scientists need only assign a logical value, “0” or “1”, to each state. Using certain molecules, scientists have found a photocycle consisting of only two states, which can be “switched” by changing the acid-base balance of the environment - and that is very easy to do with an electrical signal. Modern technologies already make it possible to create entire chains of molecules organized in this way, so it is quite possible that molecular computers await us “just around the corner”.

The history of computer development is not over yet; besides the improvement of old technologies, completely new ones are being developed. An example is quantum computers, devices that operate on the basis of quantum mechanics. A full-scale quantum computer is still a hypothetical device whose construction depends on serious advances in the quantum theory of many-particle systems and on complex experiments; this work lies at the cutting edge of modern physics. Experimental quantum computers already exist, and elements of quantum computing can be used to increase the efficiency of calculations on existing hardware.

It is simply impossible to imagine modern life without the computer. Yet just 10-12 years ago, not everyone could afford this “miracle” of modern electronics. Below we trace the evolutionary development of personal computers and identify the key stages in the PC's transition from the category of “for those whose means allow” to the category of “publicly available”. In the history of computer technology, only eight names stand out as having made the greatest contributions to the main evolutionary stages of PC production. Over the course of several decades, electronics not only overtook but largely replaced mechanics. Not merely evolutionary but revolutionary steps were taken, so that in less than a century society became thoroughly “addicted” to computers.

Instead of a preface

Perhaps it is simply impossible to imagine modern life without a computer. Yet just ten years ago, not everyone could afford this “miracle” of modern electronics. I remember sitting in the library over books, copying what I needed into notes. And those terrible handwritten abstracts, the callus on the middle finger of the right hand...

Unlike the German computer, whose basis was relays, most of the elements in ENIAC were vacuum tubes. It was a real monster, costing almost 500 thousand dollars and occupying an entire room. The device weighed 27 tons and contained about 17.5 thousand tubes of various types, 7.2 thousand crystal diodes, 1.5 thousand relays, 70 thousand resistors and 10 thousand capacitors. The machine required a power supply of 174 kW. Computing power: 357 multiplication operations or 5 thousand addition operations per second. Calculations were performed in the decimal number system, and the computer easily handled numbers 20 digits long.

Despite its computational superiority, ENIAC had plenty of shortcomings. For example, if even one tube burned out, the entire computer failed. Programming it was also a lengthy process: the solution of a problem took only minutes, while setting up the program and entering data could take several days.

ENIAC never became widespread; it was produced in a single copy and saw no further use. But some of the principles underlying ENIAC's design were subsequently reflected in more advanced electronic computing machines.

"Made in USSR"

In 1951, MESM, the Small Electronic Calculating Machine, was created on the territory of the Ukrainian SSR. It contained 6 thousand vacuum tubes and barely fit into the left wing of the dormitory building of the former monastic settlement of Feofaniya (10 km from Kyiv). MESM was created in the computer technology laboratory of the Institute of Electrical Engineering of the Academy of Sciences of the Ukrainian SSR under the leadership of Academician S.A. Lebedev.

Lebedev's thoughts about creating a high-speed calculating machine date back to the 1930s, when the young scientist was researching the stability of power systems. But the war of the 1940s forced him to set these plans aside for a time.

In 1948, Lebedev and a group of engineers moved to Feofaniya (to a branch of the Institute of Electrical Engineering of the Academy of Sciences of the Ukrainian SSR) and began three years of work on a secret project to create the first domestic computer.

“The machine occupied a room of 60 square meters. MESM worked at a speed unprecedented at that time: 3 thousand operations per minute (modern computers perform millions of operations per second). It could perform subtraction, addition, multiplication, division, shifts, comparison by sign, comparison by absolute value, transfer of control, reading of numbers from a magnetic drum, and addition of commands. The total power of the vacuum tubes was 25 kW.”

After a series of tests, S.A. Lebedev proved that his machine was “smarter than a person”. There followed public demonstrations and the conclusion of an expert commission on putting MESM into operation (December 1951).

MESM was practically the only computer in the country solving scientific and technical problems in the fields of thermonuclear processes, space flight and rocketry, long-distance power transmission, mechanics, and statistical quality control. One of the most important problems solved on MESM was calculating the stability of parallel operation of the generating units of the Kuibyshev hydroelectric power station, described by a system of nonlinear second-order differential equations: it was necessary to determine the conditions under which the maximum possible power could be transmitted to Moscow without compromising the stability of the system. With the rapid development of jet and rocket technology, the machine was also tasked with external ballistics calculations of varying complexity, from relatively simple multivariate calculations of trajectories within the earth's atmosphere to very complex ones connected with the flight of objects beyond it.

MESM was used in many research projects right up to 1957, after which the machine was dismantled and disassembled into parts; the equipment was handed over to the Kiev Polytechnic Institute for laboratory work.

The first computers with data storage capabilities

As mentioned earlier, some of the first electronic computing systems became prototypes for the creation of more advanced computerized devices. The main task of the developers of new computers was related to providing machines with the ability to store processed and received data in electronic memory.

One of these machines was the “Manchester Baby”. In 1948, at the University of Manchester (UK), an electronic computing device capable of storing data in internal random-access memory was developed, and a year later it was put into operation. The Manchester Mark 1 was an improved machine built on von Neumann's principles.

The device could not only read information from punched tape but could also input and output data from a magnetic drum directly while a program was running. The “memory” system was a bank of Williams cathode-ray tubes (patented in 1946).

The “Manchester Baby” had thoroughly grown-up dimensions: 17 m in length. The system consisted of 75 thousand vacuum tubes, 3 thousand mechanical relays, 4 Williams tubes (a memory of 96 40-bit words), a magnetic drum (1,024-4,096 40-bit words), a processor with 30 instructions, and a battery power system. The machine needed from 3 to 12 seconds for the simplest mathematical operations.

In 1951, the “Baby” was scrapped, and its place was taken by a full-fledged commercial computer, the Ferranti Mark 1.

Around the same time, in Cambridge (UK), a group of engineers led by Maurice Wilkes created EDSAC (Electronic Delay Storage Automatic Calculator), a computer with its program stored in memory. This device became the first widely used electronic computer with internal memory.

The computer used almost 3 thousand vacuum tubes. Its main memory of 1,024 cells was built from 32 mercury ultrasonic delay lines, each storing 32 words of 17 bits, including the sign bit. Additional delay lines could be included, making it possible to work with 35-bit words. Calculations were carried out in binary at a speed of 100 to 15 thousand operations per second. Power consumption was 12 kW; the occupied floor area, 20 square meters.

In 1953, under the leadership of Wilkes and Renwick, work began on a second model, the EDSAC-2. Its RAM now used ferrite-core elements with a total capacity of 1,024 words. The new machine had a ROM (read-only memory), first on diodes and then on a ferrite matrix. But the main innovation was microprogram control: individual commands could be composed of sets of micro-operations, and the microprograms were recorded in permanent memory. This computer was used until 1965.

"Transistor" story

The beginning of the era of computers for everyday use is associated with the same IBM. After a change of management in 1956, the company changed its production vector. In 1957, IBM introduced the FORTRAN language (“FORmula TRANslation”), used for scientific computing. In 1959, the first transistor-based IBM computers appeared, reaching such levels of reliability and speed that the military began using them in air-defense early-warning systems. In 1964, the entire IBM System/360 family was introduced: the first deliberately designed family of computers, the first universal computers, the first computers with byte-addressable memory (and the list of firsts does not end there). IBM System z computers compatible with the System/360 are still being produced, an absolute record for compatibility.

The evolutionary development of computer technology included: a reduction in size, a transition to more advanced components, an increase in computing power, an increase in the amount of RAM and permanent storage, the possibility of widespread use in various industries, as well as the possibility of personalizing a computer.

In the 50s and 60s of the twentieth century, transistor computers replaced tube computers. Semiconductor diodes and transistors served as the main elements, and magnetic cores and magnetic drums (distant ancestors of modern hard drives) as memory devices. The second difference of these computers: it became possible to program in algorithmic languages, and the first high-level languages (Fortran, Algol, Cobol) were developed. These two important improvements made writing computer programs much easier and faster; programming, while remaining a science, acquired an applied character. All this led to smaller sizes and significantly lower costs, and computers began for the first time to be built for sale.

These computers performed up to 30 thousand operations per second and had up to 32 KB of RAM. Their big advantages were reduced dimensions and reduced energy consumption. Programming transistor computers laid the basis for the first “operating systems”. Working with the machines became easier, accessible not only to scientists but also to less “advanced” users, and computer equipment appeared in factories and offices (mainly in accounting departments).

Among the transistor-based computers of this period, the most famous were the following:

Late 50s. The most powerful computer in Europe was the Soviet M-20, with an average speed of 20 thousand three-address instructions per second on 45-bit floating-point numbers; its RAM was implemented on ferrite cores and held 4,096 words.

1954-1957. NCR (USA) produced the first computer using transistors, the NCR-304;

1955. The Bell Telephone Laboratories transistor computer TRADIC contained 800 individual transistors;

1958. NEC Corporation developed the first Japanese computers, the NEC-1101 and NEC-1102.

Note that these are not the only representatives of the “transistor” history in the evolution of computers. During this period, developments were carried out at the Massachusetts Institute of Technology (USA), in many scientific and technical laboratories throughout the Soviet Union, and in leading European research and technological higher schools.

Microchips and mass production

It took developers only a few years to produce computers from the new components. Just as transistors had replaced vacuum tubes (which had themselves replaced mechanical relays), so microcircuits took their place in the evolutionary chain. The end of the 60s brought the following metamorphoses: integrated circuits were developed, combining chains of transistors on a single semiconductor; semiconductor memory appeared and became the main element of computer RAM; simultaneous programming of several tasks (interactive, time-shared operation) was mastered; the central processor could now work in parallel with, and control, various peripheral devices; and remote access to computer data became possible.

It was during this period that the famous family of IBM computers appeared. The production of electronic computing equipment moved onto the conveyor belt, and mass production of computerized equipment was established.

Of course, more should be said about the IBM System/360 (S/360). In 1964, the company released a series of computers of different sizes and capabilities: depending on requirements, both small low-performance machines and large high-performance ones could equally be used in production. All the machines ran compatible software, so replacing a low-power device with a more advanced one did not require rewriting the main program. To ensure compatibility, IBM pioneered the use of microcode technology, applied in all but the highest-end models of the series. This was the first series in which a clear distinction was made between a computer's architecture and its implementation.

The S/360 cost the company $5 billion (a colossal expense by 1964 standards), making it one of the most expensive commercial development projects of its time. The 360 was succeeded by the 370, 390 and System z, which retained the same architecture. On the basis of the S/360, other companies produced their own model series, for example the 470 family from Amdahl, Hitachi mainframes, the UNIVAC 9200/9300/9400, and the Soviet ES series of computers.

Thanks to the widespread use of the IBM/360, the 8-bit character and the 8-bit byte as the minimum addressable memory cell, both invented for it, became the standard for all computer equipment. The IBM/360 was also the first 32-bit computer system. The senior models of the IBM/360 family and the IBM/370 family that followed were among the first computers with virtual memory and the first mass-produced computers supporting virtual machines. The family also made extensive use of microcode to implement individual processor instructions.

But some microcircuit-based systems had one drawback: the low quality of components. This was especially pronounced in Soviet computers, which remained bulky and lagged behind Western developments in functionality. To compensate, domestic designers had to build special processors for specific tasks (which excluded the possibility of multiprogramming).

The first minicomputers (forerunners of modern PCs) also appeared in this period. The most important development of the late 60s and early 70s was the transition from large numbers of separate elements to a single part combining all the necessary components: the microprocessor, the heart of any computer. Society owes its appearance to Intel, which produced the first microprocessor - a truly revolutionary and evolutionary leap for computer technology.

Along with the rapid improvement of hardware, electronic computing systems began to be combined into local and global computer networks (the prototype of the Internet). Programming languages were improved, and more advanced operating systems were written.

Supercomputers and Personal Portable Electronics

The seventies and eighties became the main period of mass production of computers for general consumption; there were no fundamentally new principles in this period. Electronic computing technology split into two camps: supermachines with incredible computing capabilities, and more personalized systems. The element base of these systems was large-scale integrated circuits (LSI), with more than a thousand elements placed on one chip. The power of such computers reached tens of millions of operations per second, and RAM grew to several hundred megabytes.

Computerized systems used in production remained complex, but market leadership was shifting to personal computers. It was during this period that the term “electronic computing machine” gave way to the familiar term “computer”.

The era of personal computers began with the Apple, the IBM PC (XT, AT, PS/2), the Iskra, Elektronika, ES-1840, ES-1841 and others. These systems were inferior in power to supercomputers, but thanks to its consumer orientation the PC became firmly established in the market: the device became generally affordable, and a number of innovations appeared that simplified work with it (the graphical user interface, new peripheral devices, global networks).

After the release of the Intel 4004 and Intel 8008 microprocessors, the technology was picked up by other companies: microprocessors were produced both to Intel's design and in the companies' own modifications.

This is where the young Apple Computer company of Steve Jobs and Steve Wozniak appeared on the scene with its first product, the Apple-1 computer. Few ambitious entrepreneurs were interested in the development, and there was only one order for a batch of Apple-1 computers: Paul Terrell, owner of the Byte Shop computer store, ordered 50 units, on the condition that they be not bare computer boards but fully complete machines. Overcoming difficulties in financing production, Apple Computer nevertheless managed to fulfill its obligations on time, and the Apple-1 appeared on the shelves of Terrell's store. True, the machines arrived without the full “ammunition” he had asked for, but Terrell accepted the delivery and paid the promised $500 per unit.

Note that most PCs of that time were supplied as separate components, the assembly of which was carried out by distributors or end customers.

So, in 1976, the Apple-1 went on sale for $666.66 apiece. The Apple-1 was fully assembled on a circuit board containing about 30 chips, which is why many consider it the first full-fledged PC. But to get a working computer, users still had to add a case, power supply, keyboard and monitor. An additional board, released later at a cost of $75, provided a connection to a cassette recorder for data storage.

Many experts do not consider the Apple machine the first personal computer; they give that title to the Altair 8800 microcomputer, created by Ed Roberts and sold through catalogs in 1974-1975. In truth, however, that device did not meet all user requirements.

The company continued production, and the updated Apple II model went on sale. This PC was equipped with a 1 MHz MOS Technology 6502 processor, 4 KB of RAM (expandable to 48 KB), 4 KB of ROM containing the Monitor program and an Integer BASIC interpreter, and an interface for connecting a cassette recorder. The Apple II became the best-selling device on the market (more than 5 million units were sold over its years of production). It looked more like an office tool than a piece of electronic equipment: a complete computer suitable for the home, a manager's desk, or a school classroom.

A composite video output in NTSC format was used to connect a monitor (or TV); computers sold in Europe used an additional PAL encoder on an expansion card. Sound was provided by a speaker controlled through a register in memory (using 1 bit). The computer had 8 expansion connectors, one of which was used for additional RAM while the rest provided I/O (serial and parallel ports, external device controllers). The initial retail price ranged from $1,298 to $2,638 depending on the configuration.

The Apple II grew into a whole family of machines and retained market leadership until the early 90s.

General PC Standard

At the end of 1980, IBM decided to produce its own PC. The supply of microprocessors for the future IBM PC was entrusted to Intel, and the PC-DOS operating system of Harvard dropout Bill Gates was adopted as the main OS.

The company not only set production rates but also established its own standards for PC manufacturing. Any PC maker could buy a license from IBM and assemble similar computers, and microprocessor manufacturers could make components for them (in fact, only Apple managed to keep its own architecture). Thus appeared the IBM PC XT model with a hard drive, followed by the IBM PC AT built on the 80286 microprocessor.

The year 1985 was marked by the release of high-performance PCs; Intel and Motorola brought out the 80386 and M68020 microprocessors, respectively. From year to year computer designs improved, with the names IBM and Intel constantly in the air. New microprocessors achieved remarkable data processing power, up to 50 million operations per second. In 1993, Intel released the P5 Pentium microprocessor with a 64-bit data bus, followed by the Pentium II and Pentium III. The Pentium 4 was equipped with HT (Hyper-Threading) technology, allowing it to process information in two parallel threads.

Computers improved in every way: energy consumption and dimensions decreased while computing power grew enormously, RAM increased (up to 4 gigabytes), and hard drive capacity came to be measured in terabytes.

Almost all computers produced in the world switched to Microsoft's “windowed” operating system, Windows, and the MS Office suite of applications. Thus the personal computer standards were defined: the IBM PC architecture and the Windows OS.

As for size, alongside desktop computers came portable electronics: laptops and netbooks, then tablets and smartphones (the phone-computer).

Instead of an afterword

Over several decades, personal computers have gone from electronic “calculating machines” to everyday equipment. A PC today is not just an electronic computing device but an entire industry of knowledge, entertainment, work, education and other consumer possibilities.

Mikhail Polyukhovich

The personal computer (PC) has greatly changed humanity's relationship with computing resources. With each new PC model, people have shifted more and more functions onto the machine, from simple calculations to accounting and design. That is why malfunctions, failures and downtime of computer equipment have become not just unwanted misunderstandings but a real disaster, capable of leading to direct economic losses and other unacceptable consequences.

The first milestones in the development of personal computers


In the second half of the 20th century, only large companies had computers, not only because of the high price of the equipment but also because of its impressive size. Enterprises developing and manufacturing computer equipment therefore strove to miniaturize and reduce the cost of their products. Microminiaturization and the widespread development of microcircuits eventually allowed a computer to fit on a desk, and in 1973 Xerox introduced the first personal computer, the Alto, in which, for the first time, programs and files were displayed on the screen in the form of “windows”.

In 1975, the first commercial PC, the Altair-8800, was released, built on the Intel 8080 microprocessor. It had 256 bytes of RAM and was controlled from a panel of switches. For data input and output, an 8-inch floppy disk drive was available, purchased separately. The first version of the i8080 microprocessor was manufactured in a 48-pin planar package, with a maximum clock frequency of 2 MHz. The processor had a serious flaw, however: it could freeze, and only the “reset” signal could revive the system. A corrected and improved version, the 8080A, was released six months later in a DIP-40 package, with the maximum clock frequency raised to 2.5 MHz.

The beginning of the journey of Apple and Intel


In 1976, in Palo Alto, Steve Jobs and Steve Wozniak assembled a working computer board called the Apple I. Housed in a wooden case, it had no keyboard or screen of its own; the board carried a processor, 8 KB of RAM, and circuitry for displaying information on a screen.

In 1977, Wozniak and Jobs developed the first complete PC, the Apple II, in a plastic case, with an integrated keyboard and a TV set used as the display. That same year, Commodore introduced a PC called the PET.

In June 1978, Intel created the first 16-bit microprocessor, the i8086. Thanks to segmented memory organization, it could address up to 1,024 KB of RAM. The i8086 used an instruction set that is still the basis of modern processors, and with its advent the x86 architecture was born. The processor's clock frequency ranged from 4 to 10 MHz. Notably, the 8086 gained popularity mainly thanks to the Compaq DeskPro computer.
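
The segmented addressing mentioned above is simple arithmetic, sketched here in Python (illustrative only; the function name is this example's invention): a physical address is the 16-bit segment shifted left by 4 bits plus the 16-bit offset, giving a 20-bit address space of 2^20 bytes = 1,024 KB.

    # 8086 real-mode physical address = segment * 16 + offset.
    def physical_address(segment: int, offset: int) -> int:
        assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
        return (segment << 4) + offset

    print(hex(physical_address(0x0000, 0x0000)))  # 0x0 (bottom of memory)
    print(hex(physical_address(0xFFFF, 0x000F)))  # 0xfffff (top of 1 MB)
    print(2 ** 20 // 1024, "KB addressable")      # 1024 KB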

In 1980, Osborne Computer began producing the first portable PCs, which had the dimensions of a suitcase and weighed 11 kg.

IBM's first steps


In 1981, IBM released the IBM PC, an open-architecture microcomputer based on Intel's 16-bit 8088 microprocessor. The 16-bit i8088 with its 8-bit data bus was clocked at 5 to 10 MHz depending on the version. The PC was equipped with a monochrome text display, two 160 KB 5.25-inch floppy disk drives and 64 KB of RAM.

In 1983, the IBM PC XT (eXtended Technology) appeared, with 256 KB of RAM and a 10 MB hard drive. Its processor ran at 4.77 MHz.

The IBM PC AT (Advanced Technology) was introduced in 1984. The computer ran on an Intel 80286 microprocessor (produced since February 1, 1982) and the ISA architecture, and came with a 20 MB hard drive. The 80286 made it possible to switch to the AT bus: a 16-bit data bus and a 24-bit address bus, which allowed addressing up to 2^24 bytes = 16 MB of RAM (compared to 640 KB in the original IBM PC). The motherboard carried a battery to power a clock chip, which kept the time in a small memory (50 bytes). Processor clock speeds: 80286-6, 6 MHz; 80286-8, 8 MHz; 80286-10, 10 MHz; 80286-12, 12.5 MHz.

In October 1985, Intel created the first 32-bit microprocessor, the i80386, comprising about 275 thousand transistors. The first PC to use it was the Compaq DeskPro 386. A cheaper version with a 16-bit external data bus, the i80386SX, did not appear until June 1988, after which the original processor received the DX suffix. It was the 386 that brought a noticeable increase in the clock speed of personal computers: different models operated at 16, 20, 25, 33 and 40 MHz.

Intel's Colossal Breakthrough


In 1989, Intel released the 486DX microprocessor: 1.2 million transistors on a single chip, fully compatible with previous x86 processors. It was the first chip to combine a central processor, a math coprocessor and cache memory. Clock frequencies of the various 486 modifications ranged from 16 to 150 MHz, with 486-based computers reaching 133 MHz (the so-called DX4). The 486DX2 processors used a multiplier of 2 (with a 50 MHz front-side bus, the processor ran at 100 MHz). Processors with the DX4 index were produced later; their multiplication factor was in fact 3, not 4. After Intel's 486 processors left the market, AMD released the 486DX4-120 and 486DX4-133. The introduction of multipliers gave rise for the first time to the concept of overclocking: raising performance by increasing the bus clock frequency or the multiplication factor. Systems were even sold with i486 processors overclocked to 160 MHz.
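
The multiplier scheme described above is a one-line calculation; the Python sketch below (illustrative; the frequencies are examples, not a catalogue of real parts) shows how the core clock follows from the bus clock and the multiplication factor, which is exactly what overclockers raise.

    # Core frequency = front-side-bus frequency x multiplier.
    def core_mhz(fsb_mhz, multiplier):
        return fsb_mhz * multiplier

    print(core_mhz(50, 2))  # DX2 style: 50 MHz bus x 2 = 100 MHz
    print(core_mhz(33, 3))  # DX4 style: 33 MHz bus x 3 = 99 MHz (~100)
    print(core_mhz(40, 3))  # raised bus clock: 40 x 3 = 120 MHz (overclocked)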

In March 1993, Intel began shipping the 66 and 60 MHz versions of the Pentium processor. Pentium-based PCs were fully compatible with computers using the i8088, i80286, i80386 and i486 microprocessors. The new processor contained approximately 3.1 million transistors and had a 32-bit address bus and a 64-bit external data bus.

In May 1997, Intel introduced the Pentium II processor, based on the Pentium Pro. A unit for processing MMX instructions was added to the P6 core, while the second-level cache was moved out of the processor die onto the cartridge board, which contributed to the mass distribution of the Pentium II. Clock speeds rose noticeably; different models ran at 233, 266, 300, 333, 350, 400, 433, 450, 466, 500 and 533 MHz.

The sixth-generation 32-bit Intel Pentium III microprocessor was released by Intel in February 1999. It largely copied the Pentium II but added new features: 70 new SSE instructions (Streaming SIMD Extensions, also called MMX2) oriented toward multimedia support, and an improved L1 cache controller. Pentium III (Katmai) processors ran at 450, 500, 533, 550 and 600 MHz; Coppermine-based models, from 533 to 1133 MHz; and Pentium III processors on the Tualatin core, from 1000 to 1400 MHz.

The era of multi-core processors


At the end of November 2000, Intel introduced Pentium 4 processors clocked above 1 GHz, built on the NetBurst architecture and using fast Rambus memory with an effective system bus frequency of 400 MHz. The processors added 144 new SSE2 instructions. Clock speeds of the first Pentium 4s ranged from 1.4 to 2.0 GHz; in later modifications the clock frequency rose from 2.2 to 3.8 GHz.

In July 2006, Intel released its dual-core Core 2 processors; the first models in the line were the Intel Core 2 Duo and Intel Core 2 Extreme. The processors were built on the new Intel Core architecture, which the company called the most significant step in the development of its microprocessors since the introduction of the Pentium brand in 1993. Thanks to EM64T technology, Intel Core 2 processors can operate in both 32-bit and 64-bit modes. The main differences from the Pentium 4 family were lower heat output and power consumption, as well as greater overclocking headroom. The frequency of Core 2 Duo processors ranged from 1.5 to 3.5 GHz.

At the beginning of 2007, the Core 2 Quad, a quad-core processor, was introduced. Clock frequencies – from 2.33 to 3.2 GHz.

In January 2010, Intel Core i3 processors appeared. They added an integrated graphics processor that carries out calculations in graphics mode. A built-in "intelligent" feature provides automatic acceleration: at medium and low loads the processor runs at its rated performance and saves energy, while an increase in load automatically raises processor performance. The cache (the processor's internal high-speed memory) was enlarged and is distributed dynamically between the cores depending on load. The new processors run hotter, especially during automatic overclocking, and accordingly require a more effective cooling system. Clock frequencies of the i-series processors (i3, i5, i7) run from 2.66 to 3.6 GHz.

The first electronic computers appeared after the Second World War, when the discoveries of mathematicians and other scientists made it possible to implement a new way of handling information. And although today these machines look like outlandish artifacts, they were the ancestors of the modern PCs familiar to the average person.

Manchester "Mark I" and EDSAC

The first computer in the modern sense of the word was the Manchester Mark I, created in 1949. Its uniqueness lay in the fact that it was completely electronic and stored its program in RAM. This achievement of British specialists was a great leap forward in the centuries-long history of the development of computers. The Manchester Mark I used Williams tubes and magnetic drums as its information storage.

Today, many years later, the question of which machine can be called the first computer remains a matter of dispute. The Manchester Mark I is the most popular candidate, but there are other contenders. One of them is EDSAC; without this machine, the history of the computer as an invention would have been completely different. While the Mark I appeared in Manchester, EDSAC was created by scientists at the University of Cambridge. This computer went into operation in May 1949, when the first program was executed on it: one that squared the numbers from 0 to 99.

Z4

The Manchester Mark I and EDSAC were built for specific tasks. The next step in the evolution of computing machines was the Z4, a device distinguished not least by the dramatic story of its creation. The computer was built by the German engineer Konrad Zuse. Work on the project began during the final stage of the Second World War, a circumstance that greatly slowed development: Zuse's laboratory was destroyed in an air raid, and with it all the equipment and the preliminary results of years of work were lost.

Nevertheless, the talented engineer did not give up. Work resumed after the war ended, and in 1950 the project was finally completed. The story of its creation had been long and thorny, but the computer immediately attracted the attention of the Swiss Federal Institute of Technology (ETH Zurich), which bought the machine. The Z4 interested specialists for good reason: it offered universal programming, making it the first multifunctional device of its type.

In that same year, 1950, the history of computing in the USSR was marked by an equally important event: at the Kiev Institute of Electrical Engineering, MESM, the Small Electronic Calculating Machine, was created. A group of Soviet scientists led by Academician Sergei Lebedev worked on the project.

The design of this machine included six thousand vacuum tubes. Its power made it possible to take on tasks previously unprecedented for Soviet technology: the device could perform about three thousand operations per second.

Commercial models

At the first stage of computer history, development was carried out by specialists at universities or other government agencies. In 1951 the LEO I appeared, created with investment from the British private company Lyons and Company, which owned restaurants and shops. With the advent of this device, the history of computers reached another important milestone: LEO I was the first computer used for commercial data processing. Its design was similar to that of its ideological predecessor, EDSAC.

The first American commercial computer was UNIVAC I, which appeared in the same year, 1951. A total of forty-six units were sold, each costing a million dollars. One of them was used by the US Census Bureau. The device consisted of more than five thousand vacuum tubes, and mercury delay lines served as its memory; each line could store up to a thousand words. In developing UNIVAC I, its designers decided to abandon punched cards and switch to metallized magnetic tape, through which the device could connect to commercial data storage systems.

"Arrow"

Meanwhile, Soviet electronic computers had their own history. The Strela computer, which appeared in 1953, became the first serial computer in the USSR. It was produced at the Moscow Calculating and Analytical Machines Plant. Over three years of production, eight units were built. These unique machines were installed at the Academy of Sciences, Moscow State University, and design bureaus located in closed cities.

"Strela" could perform 2-3 thousand operations per second. These were record numbers for domestic technology. The data was stored on magnetic tape, which could hold up to 200 thousand words. The developers of the device were awarded. Chief designer Yuri Bazilevsky also became a Hero of Socialist Labor.

Second generation of computers

Transistors were invented back in 1947, and by the end of the 1950s they had replaced energy-hungry, fragile vacuum tubes. With the advent of transistors, the history of computers entered a new phase: machines built with these new parts were later recognized as second-generation models. The main advance was that printed circuit boards and transistors made it possible to significantly reduce the size of computers, making them far more practical and convenient.

Where computers had previously occupied entire rooms, they now shrank to the proportions of an office desk; the IBM 650 was one such model. But even transistors did not solve another important problem: computers were still extremely expensive, so they were made only to order, for universities, large corporations, or governments.

Further evolution of computers

Integrated circuits were invented in 1959 and marked the beginning of the third generation of computers. The 1960s became a turning point: production and sales grew significantly. The new parts made the devices cheaper and more accessible, although they were still not personal; these computers were mostly bought by companies.

In 1971, Intel released the first microprocessor in history onto the market, and on its basis fourth-generation computers appeared. Microprocessors solved several important problems inherent in the design of any earlier computer: a single part now performed all the logical and arithmetic operations expressed in machine code, work that had previously been spread across many small components. The appearance of a single universal part heralded the development of small home computers.

Personal computers

In 1977, Apple, founded by Steve Jobs, introduced the Apple II to the world. Its fundamental difference from all previous computers was that the young Californian company's device was intended for sale to ordinary citizens: a breakthrough that until recently had seemed unheard of. Thus began the history of personal computers. The new product remained in demand until the 1990s; over that period about seven million units were sold, an absolute record for the time.

Subsequent Apple models received a unique graphical interface, a keyboard familiar to modern users, and many other innovations. It was Apple, too, that popularized the computer mouse. In 1984 the company introduced its most successful model, the Macintosh, which marked the beginning of a line that still exists today. Many discoveries by Apple's engineers and developers became the basis for today's personal computers, including those built by other manufacturers.

Domestic developments

Because all the revolutionary computing breakthroughs occurred in the West, the history of computers in Russia and the USSR remained in the shadow of foreign successes. This was due in part to the fact that the development of such machines was controlled by the state, whereas in Europe and the USA the initiative gradually passed into the hands of private companies.

In 1964, the first Soviet semiconductor computers, "Sneg" ("Snow") and "Vesna", appeared. In the 1970s, Elbrus computers began to be used in the defense industry, in missile defense systems and nuclear research centers.

The computer is one of the greatest inventions of its time. Billions of people around the world use computers in their daily lives.

Over the decades, the computer has evolved from a very expensive and slow device to today's extremely smart machines with incredible processing power.

No single person is credited with inventing the computer; many believe that Konrad Zuse and his Z1 machine were the first in the long line of innovations that gave us the computer. Zuse, a German, gained fame for creating the first freely programmable mechanical computing device in 1936. The Z1 was built around three main elements that are still used in modern computers. Later, Konrad Zuse created the Z2 and Z3.

The first computers of the Mark series were built at Harvard. The Mark I was completed in 1944; it was the size of a room, measuring 55 feet long and 8 feet high, and could perform a wide range of calculations. It proved a successful invention and was used by the US Navy until 1959.

The ENIAC computer was one of the most important advances in computing. It was commissioned during World War II by the American military. For fast calculation it used vacuum tubes instead of electric motors and levers, making it thousands of times faster than any other computing device of the time. ENIAC was huge, cost a total of $500,000, and remained in service until 1955.

RAM, or Random Access Memory, was introduced in 1964. The earliest random-access storage took the form of a metal plate positioned close to a vacuum tube, which detected differences in electrical charge. It offered a convenient way to store computer instructions.

The 1940s brought many innovations. In Manchester, researchers from the Telecommunications Research Establishment developed the first computer to use a stored program; it became operational in 1948. Its successor, the Manchester Mark I, carried the work forward and by 1951 showed enormous progress.

UNIVAC was built by the creators of ENIAC. It was the fastest and most innovative computer of its day, capable of handling large volumes of calculation. A masterpiece of its time, it was highly praised by the public.

IBM produced the first personal computer to be widely used and broadly available. The IBM 701 was the first general-purpose computer developed by IBM, and a new computer language called "Fortran" was used in the subsequent 704 model. The IBM 7090 was also a great success, dominating office computing for the next 20 years. In the late 1970s and early 1980s, IBM developed the personal computer known as the PC. IBM has had a huge influence on the computers used today.

With the growth of the personal computer market in the early and mid-1980s, many companies realized that graphical interfaces were more user-friendly. This led Microsoft to develop the operating system called Windows. The first version was Windows 1.0, followed by Windows 2.0 and 3.0. Today Microsoft remains as popular as ever.

Today, computers are extremely powerful and more affordable than ever. They have found their way into practically every aspect of our lives, serving as powerful tools for communication and commerce. The future of computing is vast.