
Abstract

In the discipline "Informatics"

On the topic: The history of the development of computers

Completed by:

1st-year student of group M-14-9r

Kadyrov Ravshan

_______________

Supervisor:

Kosova Ya.V.

_______________

_______________

Almaty 2015

The history of the development of computers

Content

1. Foreword

2. Computing tools before the advent of computers

3. Creation of the first computers

4. Vacuum-tube computers

5. Transistor computers

6. The era of integrated circuits

7. The fourth generation

Foreword

A computer, or electronic computing machine (ECM), is a hardware-software computing device implemented on electronic components that performs the actions specified by a program.

The term "electronic computing machine" is now rarely used, except in a historical sense.

Computing tools before the advent of computers

Figures: Russian abacus; the Felix-M calculating machine

The history of computing reaches deep into the past, like the development of mankind itself. Accumulating stocks, dividing up production, exchanging goods: all such activities involve counting. People counted on their fingers and with pebbles, sticks and knots. The need to solve ever more complex problems, and with them to perform ever more laborious calculations, confronted people with the need to invent devices that could help. Historically, different countries developed their own monetary units and measures of weight, length, volume and distance. Converting from one system of measurement to another required calculations that could most often be performed only by specially trained people, who were sometimes invited from other countries. This naturally led to inventions that aided counting.

One of the first devices (6th-5th centuries BC) to facilitate calculation was a special counting board called the abacus. Calculations were made on it by moving pebbles or bones in the grooves of boards made of bronze, stone or ivory. Over time, these boards came to be divided into several strips and columns. In Greece the abacus existed as early as the 5th century BC; the Japanese called it the soroban, the Chinese the suanpan.

In ancient Russia, a counting device similar to the abacus, the "Russian schyot" (counting board), was used. By the 17th century this device had taken on the appearance of the familiar Russian abacus.

At the beginning of the 17th century, when mathematics began to play a key role in science, the need for a calculating machine was felt ever more keenly. In the middle of the century, the young French mathematician and physicist Blaise Pascal created a "summing" machine, the Pascaline, which could perform subtraction as well as addition.

In 1670-1680, the German mathematician Gottfried Leibniz designed a calculating machine that performed all four arithmetic operations. Over the next two hundred years several more similar counting devices were invented and built, but owing to their shortcomings, including slow operation, they were not widely used.

Only in 1878 did the Russian scientist P. Chebyshev propose a calculating machine that performed addition and subtraction of multi-digit numbers. The greatest popularity at that time was gained by the adding machine designed in 1874 by the St. Petersburg engineer Odhner. The design proved very successful, since it allowed all four arithmetic operations to be performed quickly.

In the Soviet Union, a more advanced arithmometer, the Felix, was developed; produced from 1929 to 1978, these counting devices were used for several decades as the main technical means of easing human computational labor.

Creation of the first computers

In 1812 the English mathematician and economist Charles Babbage began work on his "Difference Engine", which, according to his ideas, was not merely to perform arithmetic operations but to carry out calculations according to a program specifying a particular function. As the basic element of his machine Babbage chose a gear wheel for remembering one digit of a number (the machine had 18 such wheels in total). By 1822 he had built a small working model and used it to calculate a table of squares.

In 1834 Babbage set about building an "Analytical Engine". His design comprised over 2,000 drawings of various assemblies. Babbage's machine was conceived as a purely mechanical, steam-powered device. It consisted of a store for numbers (the "warehouse"), a device for performing arithmetic operations on numbers (which Babbage called the "mill") and a device controlling the machine's operations in the required sequence, including the transfer of numbers from one place to another; means for the input and output of numbers were also provided. Babbage worked on his machine until the end of his life (he died in 1871), managing to build only some of its components, which proved too complex for the technology of the time.

In 1842 a short paper by the Italian military engineer L. F. Menabrea, "Sketch of the Analytical Engine Invented by Charles Babbage", was published in Geneva; it was later translated by Babbage's student and assistant Lady Ada Lovelace, daughter of the poet George Gordon Byron. With Babbage's assistance, Ada Lovelace wrote the first programs, for solving a system of two linear equations and for calculating Bernoulli numbers, and so became the world's first programmer.

After Babbage, a significant contribution to the development of automated counting was made by the American inventor H. Hollerith, who in 1890 built the first manual punch for recording digital data on punched cards and introduced mechanical sorting to arrange the cards according to the positions of the holes. He built a tabulator, a machine that sensed the holes in punched cards, interpreted them as the corresponding numbers and counted them. Hollerith tabulators were used in censuses in the USA, Austria, Canada, Norway and other countries, as well as in the first All-Russian population census of 1897, for which Hollerith came to Russia to organize the work. In 1896 Hollerith founded the Tabulating Machine Company, specializing in the production of punched-card machines and punched cards; it later merged into the Computing-Tabulating-Recording Company, which was renamed International Business Machines (IBM), now a leading developer of computers.

The new tool, the computer, has served people for only a little more than half a century. It is one of the greatest inventions of the mid-20th century and has changed human life in many of its aspects. Computing technology has become one of the levers of scientific and technological progress.

The German scientist K. Zuse is considered the creator of the first automatic computer. He began his work in 1933, and by 1936 he had built a model of a mechanical computer that used the binary number system, floating-point representation of numbers, a three-address instruction format and punched cards. As his element base Zuse chose relays, which by then had long been used in various fields of technology. In 1938 Zuse made the Z1, a machine with a 16-word memory; the following year came the Z2, and two years later he built the world's first operational program-controlled computer, the Z3, which was demonstrated at the German Aviation Research Center. It was a binary relay computer with a memory of 64 floating-point numbers of 22 bits each: 1 bit for the sign, 7 bits for the exponent and 14 bits for the mantissa. Unfortunately, all these machines were destroyed in bombing raids during the Second World War. After the war Zuse built the Z4 and Z5 models. In 1945 he created Plankalkül (German for "calculus of plans"), one of the earliest algorithmic languages. The language was rather machine-oriented, yet in some features it surpassed ALGOL.
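To make the Z3's number format concrete, here is a minimal sketch of how such a 22-bit floating-point word could be decoded. The layout (1 sign bit, 7-bit exponent, 14-bit mantissa with an implicit leading 1) follows the figures above; the two's-complement exponent encoding and the function name are illustrative assumptions, not the Z3's exact conventions.

```python
# Illustrative decoder for a Z3-style 22-bit floating-point word.
# Assumed layout: bit 21 = sign, bits 14-20 = exponent (read here as
# 7-bit two's complement), bits 0-13 = mantissa with implicit leading 1.
def decode_z3_word(word: int) -> float:
    sign = -1.0 if (word >> 21) & 1 else 1.0
    exp = (word >> 14) & 0x7F
    if exp >= 64:                 # fold the 7 bits into a signed range
        exp -= 128
    mantissa = 1.0 + (word & 0x3FFF) / 2**14
    return sign * mantissa * 2.0**exp

# Example: exponent = 1, top mantissa bit set -> 1.5 * 2 = 3.0
print(decode_z3_word((1 << 14) | (1 << 13)))  # 3.0
```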

Independently of Zuse, relay-based automatic computers were built in the USA by G. Stibitz and H. Aiken.

G. Stibitz, then working at Bell Labs, assembled the first summing circuits from telephone relays. In 1940, together with S. Williams, he built the Complex Number Calculator, or relay interpreter, which later became known as the Bell Model 1 specialized relay computer. In the same year the machine was demonstrated at a meeting of the American Mathematical Society, where its first trials were carried out. In subsequent years four more models of this machine were created. The last of them, the Model V developed by Stibitz in 1946, was a general-purpose computer containing 9,000 relays, occupying an area of almost 90 m² and weighing 10 tons.

Another relay-computer design was put forward in 1937 by H. Aiken, a graduate student at Harvard University. IBM became interested in his idea and assigned a team of engineers led by C. Lake to assist him. Work on the design and construction of the machine, called the Mark-1, began in 1939 and lasted five years. The machine was assembled from standard parts manufactured by IBM at the time.

Vacuum tubes were first used in building a computer by the American professor of physics and mathematics J. Atanasoff. Atanasoff worked on automating the solution of large systems of linear equations. In December 1939 he finally formulated his main ideas and, together with C. Berry, put them into practice in a working desktop model of a machine. He then set about building a machine capable of solving a system with 29 unknowns. The machine's memory was capacitive, using 1,632 paper capacitors; 300 vacuum tubes were used in total. By the spring of 1942, when assembly of the machine was almost complete, the United States was at war with Germany, and the project was unfortunately curtailed.

In 1942 J. Mauchly, a professor at the Moore School of Electrical Engineering at the University of Pennsylvania, presented a proposal, "The Use of High-Speed Vacuum Tube Devices for Calculating", which marked the beginning of the first electronic computer, ENIAC. The project lay dormant for about a year until the US Army's Ballistic Research Laboratory became interested in it. In 1943, under the leadership of J. Mauchly and J. P. Eckert, work on ENIAC began; the machine was demonstrated on February 15, 1946. The new machine had impressive parameters: 18,000 vacuum tubes, an area of about 90 m², a weight of 30 tons and a power consumption of 150 kW. ENIAC ran at 100 kHz and performed an addition in 0.2 ms and a multiplication in 2.8 ms, three orders of magnitude faster than relay machines. In its structure, ENIAC resembled the mechanical calculating machines.

For a long time ENIAC was believed to be the only electronic computer, but in 1975 the United Kingdom revealed that the first programmable computer, Colossus, had been operating at the government facility at Bletchley Park since December 1943; however, not enough data was released for a proper assessment of the machine.

Revolutionary for stored-program computer architecture were the ideas of the American mathematician John von Neumann (1903-1957), a member of the US National Academy of Sciences and the American Academy of Arts and Sciences. These ideas were set out in the paper "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument", written with A. Burks and H. Goldstine and published in 1946.

Von Neumann architecture

This is how von Neumann envisioned his computer:

The machine must consist of several basic units: an arithmetic unit, a memory, a control unit and a device for communication with the operator, so that its operation does not depend on the operator.

It must store not only numerical data but also the program commands that specify the operations to be performed on the numbers.

The computer must distinguish the numeric code of a command from the numeric code of a number.

The machine must have a control unit that executes the instructions stored in memory.

It must also have an arithmetic unit for performing arithmetic operations.

Finally, it must include an input-output unit.
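The stored-program principle behind these points can be illustrated with a toy simulator: instructions and data share one memory, and a control loop fetches, decodes and executes commands in sequence. The three-instruction machine below is purely hypothetical, chosen only to show the idea.

```python
# Toy stored-program machine: code and data live in the same memory.
# Hypothetical instruction set, for illustration only.
memory = [
    ("LOAD", 4),    # 0: accumulator <- memory[4]
    ("ADD", 5),     # 1: accumulator <- accumulator + memory[5]
    ("STORE", 6),   # 2: memory[6] <- accumulator
    ("HALT",),      # 3: stop
    2, 3, 0,        # 4-6: data words in the same memory as the program
]

acc, pc = 0, 0
while True:
    op, *args = memory[pc]       # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":
        acc = memory[args[0]]
    elif op == "ADD":
        acc += memory[args[0]]
    elif op == "STORE":
        memory[args[0]] = acc
    elif op == "HALT":
        break

print(memory[6])  # 5: the machine computed 2 + 3
```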

In 1945, work began in England on the first machine with von Neumann-type stored-program memory. The work was led by T. Kilburn and F. Williams at the University of Manchester. On June 21, 1948, Kilburn and Williams ran the first program on their Mark-1 computer (which shared its name with Aiken's machine).

On May 6, 1949, another group, led by M. Wilkes, performed the first calculations on a machine of the same type, the EDSAC.

Soon other stored-program machines were built: EDVAC (1950), BINAC and SEAC.

In November 1950, the first Soviet computer, MESM, was created at the Kyiv laboratory of modeling and computing technology of the Institute of Electrical Engineering of the Academy of Sciences of the Ukrainian SSR, under the guidance of Academician S. A. Lebedev. MESM was a fundamentally new machine: Professor Lebedev applied the principle of parallel word processing.

Vacuum-tube computers

Development of the UNIVAC (Universal Automatic Computer) series began around 1947 under J. P. Eckert and J. Mauchly, who had founded the Eckert-Mauchly company. The first machine, UNIVAC-1, was built for the US Census Bureau in 1951. UNIVAC drew on the ENIAC and EDVAC designs. It operated at a clock frequency of 2.25 MHz, contained about 5,000 vacuum tubes, and its memory held 1,000 12-digit decimal numbers.

The next step was to increase the speed of memory, for which scientists began to study the properties of ferrite cores. Magnetic-core memory was first used in the Whirlwind-1 machine. It consisted of two banks of 32 × 32 × 17 cores, providing storage for 2,048 16-bit binary words.
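As a quick sanity check on those figures, assume each of the 17 core planes holds one bit of a word (16 data bits plus one check bit, which is consistent with the numbers quoted above):

```python
# Core-memory capacity arithmetic under the assumption stated above.
banks, rows, cols = 2, 32, 32
words = banks * rows * cols       # one word per (bank, row, column)
print(words)                      # 2048 words, matching the text
print(words * 16 // 8, "bytes")   # 4096 bytes of data (check bit excluded)
```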

IBM also joined the development of electronic computers, releasing its first production computer, the IBM-701, in 1952. The machine contained 4,000 vacuum tubes and 12,000 germanium diodes. In 1956 IBM released a new production computer, the IBM-704, distinguished by its high speed.

After the IBM-704 came the IBM-709, which architecturally approached the machines of the second and third generations.

In 1956 IBM developed floating magnetic heads on an air cushion, an invention that made possible a new type of memory: disk storage. Disk memory first appeared in the IBM-305 and RAMAC-650 machines, which had a stack of 50 magnetically coated metal disks rotating at 1,200 rpm. Each disk surface carried 100 data tracks holding 10,000 characters each.

Following the first production computer, UNIVAC-1, Remington Rand in 1952 released the UNIVAC-1103, which worked 50 times faster.

In October 1952 a group of Remington Rand employees proposed an algebraic form for writing algorithms; on this basis, US Navy officer and programming team leader Grace Hopper developed the first compiler, A-0.

IBM also took its first steps in automating programming, creating the Speedcoding system for the IBM-701 in 1953. In 1957 a group led by J. Backus completed work on FORTRAN, which went on to become a popular high-level programming language and helped broaden the range of computer applications.

In 1951 Ferranti began producing the Mark-1 machine, and five years later released the Pegasus computer, which used the concept of general-purpose registers.

In the USSR, the development of computing technology became a national task in 1948.

In 1950 a digital computer department was organized at the Institute of Precision Mechanics and Computer Engineering (ITMiVT) of the Academy of Sciences of the USSR to develop and build a large computer. The work was headed by S. A. Lebedev (1902-1974). The BESM machine was designed there in 1951 and entered operation in 1952. The project originally called for Williams tubes, but until 1955 mercury delay lines were used as the memory elements. BESM could perform 8,000 op/s, and from 1956 it was mass-produced under the name BESM-2.

Transistor computers

In the mid-1950s, when tube computers had reached saturation, a number of firms announced work on transistor computers. Initially this was met with skepticism, since semiconductors were expected to be difficult and expensive to manufacture; in fact, production methods for transistors improved steadily.

In 1955 the United States announced the creation of the TRADIC digital computer, built on 800 transistors and 11,000 germanium diodes. In the same year Philco announced a fully transistorized computer. The first such machine, the Philco-2000, was completed in November 1958; it contained 56,000 transistors and 1,200 diodes, but still included 450 vacuum tubes. The Philco-2000 performed an addition in 1.7 µs and a multiplication in 40.3 µs.

In England the transistor computer Elliott-803 was released in 1958; in Germany the Siemens-2002 and in Japan the H-1 appeared in 1958, and in France and Italy similar machines followed in 1960. In the USSR, a group of developers headed by E. L. Brusilovsky at the Research Institute of Mathematical Machines in Yerevan completed the semiconductor computer Razdan-2 in 1960; its series production began in 1961.

At the same time, computers based on elements other than semiconductor transistors also appeared: in Japan the Senac-1 was built on parametrons, in the USSR the Setun, and in France the CAB-500 on magnetic elements. Setun, developed at Moscow State University under the leadership of N. P. Brusentsov, became the only production computer to work in the ternary number system.
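Balanced ternary, the system Setun used, writes numbers with the digits -1, 0 and +1, so negative values need no separate sign. A minimal conversion sketch (the function name is illustrative):

```python
# Convert an integer to balanced ternary digits (-1, 0, +1),
# most significant digit first.
def to_balanced_ternary(n: int) -> list:
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3                 # Python's % yields 0, 1 or 2 here
        n //= 3
        if r == 2:                # fold 2 into -1, carrying 1 upward
            r = -1
            n += 1
        digits.append(r)
    return digits[::-1]

print(to_balanced_ternary(5))    # [1, -1, -1] -> 9 - 3 - 1 = 5
print(to_balanced_ternary(-5))   # [-1, 1, 1] -> -9 + 3 + 1 = -5
```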

A significant event in the design of second-generation machines was the Atlas computer (released in England in 1961), which introduced the concept of virtual memory. Other landmark machines of this generation were the Stretch and the CDC-6600 (USA) and the BESM-6 (USSR).

In 1960 IBM developed the powerful Stretch computing system (IBM-7030), whose developers achieved a 100-fold increase in speed; it contained 169,000 drift transistors switching at a clock frequency of 100 MHz.

A great contribution to second-generation computers was made by Control Data, which developed the CDC-6600 in 1960 (the first unit was installed in Los Angeles in 1964). The CDC-6600 architecture used a new solution, multiprocessing: several arithmetic logic units (ALUs) together with ten peripheral processors gave the machine a performance of more than 3 million op/s.

In the USSR, after the release of the first production second-generation computer, the Razdan-2, about 30 more models were developed using the same technology. At the Sergo Ordzhonikidze Computer Engineering Plant in Minsk, the first transistorized Minsk-2 computer was released in 1963, followed by its modifications Minsk-22, Minsk-22M and Minsk-23, and in 1968 the Minsk-32; these machines long played a major role in automating various sectors of the national economy.

At the Institute of Cybernetics of the Academy of Sciences of the Ukrainian SSR, under the leadership of V. M. Glushkov, a number of small machines were developed in the 1960s: Promin (1962), Mir, Mir-1 (1965) and Mir-2 (1969), which were subsequently used in universities and research organizations.

In 1964 the small computers of the Nairi series were created in Yerevan; they differed from the Mir computers in some structural features.

In the same year, the Ural series of machines (chief designer B. I. Rameev) was developed and put into production in Penza; modifications followed in 1965 and 1967, the Ural-11 and Ural-16. Computers of the Ural series had a unified system for communicating with peripheral devices.

The BESM-6 machine contained 60,000 transistors and 200,000 semiconductor diodes and combined high reliability with high speed: 1 million op/s.

With the advent of second-generation computers, developers turned to creating programming languages that made writing programs far more convenient.

One of the first programming languages was ALGOL, created by an international committee that included scientists from the Association for Computing Machinery (ACM).

The era of integrated circuits

In December 1961 a special IBM committee, having reviewed the company's technical policy in computer development, presented a report proposing the creation of computers on a microelectronic basis. Implementation of the plan was headed by two of the company's leading developers, G. Amdahl and G. Blaauw. Tackling the problem of manufacturing logic circuits, they proposed using hybrid integrated circuits in the new family, and in 1963 the company opened a plant to produce them.

In early April 1964, IBM announced six models of its IBM-360 ("System/360") family, whose appearance marked the arrival of third-generation computers. Over the family's six years IBM shipped more than 33,000 machines. The research effort cost about half a billion dollars, a huge sum by the standards of the time. In creating the System/360 family, the developers ran into difficulties building an operating system that was to be responsible for the efficient allocation and use of computer resources. The first, a universal operating system called DOS, was intended for small and medium machines; the OS/360 operating system for large machines was released later. By the end of the 1960s IBM had released more than 20 models of the IBM-360 family in all. Model 85 was the first computer in the world to use cache memory (from the French cacher, "to hide"), and Model 195 became the first computer built on monolithic circuits.

At the end of 1970 IBM began releasing a new family, the IBM-370, which retained compatibility with the IBM-360 but introduced a number of changes: the machines were convenient for assembling multi-machine and multiprocessor computing systems operating on a shared main memory.

Almost simultaneously with IBM, other companies began producing third-generation computers. In 1966-1967 they were produced by firms in England, Germany and Japan. In England, ICL launched production of the System-4 family (performance from 15 to 300 thousand op/s). In Germany, Siemens produced the 4004 series, whose machines closely copied the RCA Spectra-70 family, while in Japan Hitachi developed the HITAC-8000 series, a modification of the Spectra-70. Another Japanese company, Fujitsu, announced the FACOM-230 series in 1968. In Holland, Philips set up a computer-manufacturing division in 1968 and began producing the P1000 series, comparable to the IBM-360.

In December 1969 a number of countries (the People's Republic of Bulgaria, Hungary, the GDR, Poland, the USSR and Czechoslovakia, joined by Cuba in 1972 and the Socialist Republic of Romania in 1973) signed an agreement on cooperation in computing technology. The first results of this cooperation were shown at the ES EVM-73 exhibition (1973): six third-generation computer models, several peripheral devices and four operating systems for them. From 1975 production began of the upgraded models ES-1012, ES-1022, ES-1032 and ES-1033, which offered the best performance/cost ratio and used new logic circuits and semiconductor memory. Soon the machines of the cooperation's second series appeared; their most prominent representative was the powerful ES-1065, a multiprocessor system of four processors with 16 MB of memory, built on IS-500 integrated circuits and delivering 4-5 million op/s.

Another significant development associated with third-generation machines was the creation and introduction of visual input-output devices for alphanumeric and graphic information based on cathode-ray tubes: displays, which made it easy to explore variants of a problem interactively. The first prototypes of modern displays date back to the post-war years. In 1948 G. Fuller, an employee of the computing laboratory at Harvard University, described the design of the numeroscope, a device in which digital information appeared on the screen of a cathode-ray tube under computer control. The display fundamentally changed the process of data input and output and simplified interaction with the computer.

In the 1970s, thanks to the advent of microprocessors, it became possible to buffer both the data entered at the screen terminal and the data transmitted by the computer. This allowed the image on the screen to be regenerated by the terminal itself, and data could be edited and checked before being sent to the computer, reducing the number of errors. A cursor appeared on the screen: a movable mark indicating the position where a character is entered or edited. Displays acquired color, and it became possible to show complex graphic images, which opened the way to colorful games (the first computer games had appeared back in the 1950s, but they were pseudo-graphic) and to programs designed for working with graphics.

The fourth generation

Figures: Apple II; IBM 5150 PC; Macintosh 128k; Amiga A1000; ZX Spectrum 48k; Commodore computers of the 1980s; Atari 1040 STF with SM 124 monitor; 80486DX2 architecture; Intel Pentium architecture

This generation of computers is associated with the development of microprocessor technology. In 1971 Intel released the Intel-4004 chip, the first microprocessor and the ancestor of today's dominant x86 family, whose first member was the Intel 8086.

The history of the fourth generation began when the Japanese company Busicom (which no longer exists) ordered from Intel twelve microcircuits for use in various calculator models. The small volume of each batch drove up the development cost, but the developers managed to create a single device, a microprocessor, that could be used in all the calculators. Its clock frequency was about 0.75 MHz. The processor was four-bit: it could encode the decimal digits and a few special characters, which was sufficient for a calculator.

Computers, however, work not only with numbers but with text as well, and encoding all the digits, letters and special characters requires an 8-bit processor. Such a processor, the Intel-8008, appeared in 1972, and in 1974 came the Intel-8080. It was made with NMOS (N-channel Metal Oxide Semiconductor) technology, its clock frequency was 2 MHz, and division of numbers was implemented in the microprocessor itself.
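The jump from 4 to 8 bits is easy to see in terms of how many distinct codes an n-bit word can hold: 4 bits give 16 values (enough for the decimal digits), while 7-8 bits give 128-256 (enough for the full set of digits, letters and punctuation later standardized as ASCII). A quick illustration:

```python
# An n-bit word can represent 2**n distinct codes.
for bits in (4, 7, 8):
    print(bits, "bits ->", 2**bits, "codes")

# ASCII fits in 7 bits: every character code is below 128.
print(ord('9'), ord('A'), ord('z'))  # 57 65 122
```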

Thus the development of electronics arrived at the personal computer (PC). In the second half of the 1970s a need arose for computers for a single workplace. The first such PCs were based on 8-bit processors: the Intel-8080 and the Z80 from Zilog. Their operating system, CP/M (Control Program for Microcomputers), was developed by Digital Research.

The creators of the first PC were two young American engineers: Steve Jobs, who worked at Atari, and Steve Wozniak of Hewlett-Packard. In the summer of 1976, in the garage of Jobs' parents, they built their first PC and called it the Apple-I. To buy the necessary parts, Jobs had to sell his Volkswagen. The Apple-I had neither keyboard nor case.

In April 1977 they designed another PC, the Apple-II (the famous Apple logo, a bitten rainbow-striped apple, appeared at the same time). It had a single-board design and an expansion bus that allowed additional devices to be connected; the keyboard was housed in a separate case. A reliable 8-bit MOS 6502 served as the central processor. The memory was only 8 KB, and an ordinary cassette recorder was used for external storage. Later, graphics video adapters, a disk operating system for managing a floppy-disk drive, lowercase characters and an 80-column screen mode were developed for the Apple II.

In less than ten years this PC from Apple (founded in 1976) conquered the market: more than 2 million units were sold at a price of around $1,000. It owed its commercial success largely to its open architecture and modular design, which allowed the system to be expanded with new devices.

By 1980 the success of the PC idea was evident: the market had reached several tens of thousands of units a year. IBM, the largest electronics corporation in the US and the leader in computer production, had already made one strategic mistake by losing the minicomputer market to Digital Equipment Corporation (DEC); the success of Apple Computer was another cause for concern. IBM therefore decided to capture the PC market quickly. There was no doubt that this required a new PC model, and with it a new processor in place of the aging MOS 6502 and Zilog Z80; the choice fell on the Intel-8088.

In 1976 Intel had begun developing the Intel-8086 microprocessor, released in 1978. Its registers were twice the size of the 8080's, which made a roughly tenfold increase in performance possible. In addition, the data bus was widened to 16 bits, which was ahead of its time, since it required then-uncommon 16-bit support chips.

In 1979 a new microprocessor, the Intel-8088, was released. It differed from its predecessor only in its 8-bit external data bus, which allowed the cheap 8-bit support chips popular at the time to be used. Initially the processor ran at 4.77 MHz, but other companies subsequently developed compatible 8- and 10-MHz versions.

On August 12, 1981, IBM introduced its PC, called the IBM PC (Personal Computer). It had an Intel-8088 processor, two 160 KB floppy-disk drives and 64 KB of RAM, expandable to 512 KB. The BASIC language was stored in the PC's ROM (Read-Only Memory). IBM developed its own display, with good contrast and easily readable characters that did not tire the eyes with flicker.

By 1982 the incredible popularity of the new computer had led to numerous imitations. By 1984 more than 50 companies were producing IBM-compatible computers, and in 1986 clone sales exceeded IBM's own. The IBM PC architecture conquered the world: no other platform, whether the Apple Macintosh, NeXT or Amiga, managed to take a place beside it, although Apple and Amiga computers were also very popular.

In 1983 IBM released a new model, the PC XT (eXtended Technology), with a 10 MB hard disk and 640 KB of RAM. The PC ran under MS-DOS from Microsoft, now the largest software manufacturer.

The presentation of the next PC, the IBM PC AT (Advanced Technology), took place in 1984. The AT was built around a new microprocessor, the Intel-80286, introduced in 1982, which had a 16-bit data bus and 16-bit internal registers. The first Intel-80286 ran at 6 MHz, later raised to 20 MHz; overall the AT was five times faster than the XT. The main advantage of the Intel-80286 was its ability to work with more memory: its 24-bit address bus allowed up to 16 MB of RAM, and it could also work with virtual memory of up to 1 GB.
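The memory limits quoted here follow directly from the width of the address bus: n address lines select at most 2**n distinct byte addresses. A quick check of the figures (the 20-line 8086/8088 case is added for comparison):

```python
# Addressable physical memory as a function of address-bus width.
MB = 2**20
for lines, chip in ((20, "8086/8088"), (24, "80286"), (32, "80386")):
    print(f"{lines} address lines ({chip}): {2**lines // MB} MB")
# 20 -> 1 MB, 24 -> 16 MB, 32 -> 4096 MB (4 GB)
```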

Meanwhile, in January 1984, Apple Computer presented the first Macintosh, a machine that played a significant role in the development of the PC. It had a 9-inch high-resolution monitor, took up little desk space, and needed a minimum of connecting cables. Motorola's 68000 microprocessor served as the central processor; later models used the Motorola 68030, in some cases together with a math coprocessor and a color monitor. These PCs were well suited to home use.

In 1985 Intel announced the first 32-bit processor, the Intel-80386 (the 80386DX). It retained all the strengths of its predecessors: the entire Intel-80286 instruction set was contained in the 386 instruction set. The new processor was fully 32-bit and ran at 16 MHz (models at 25, 33 and 40 MHz followed). With the data bus widened to 32 bits, the number of address lines was also raised to 32, allowing the microprocessor to address 4 GB of physical memory directly, or 64 TB (1 terabyte = 1,024 GB) of virtual memory. The processor ran in protected mode and, for compatibility with the Intel-8086, also supported real mode; the key difference from the 80286 was the ability to switch between modes without restarting the computer. There was also a new virtual mode, which let the microprocessor behave like many independent Intel-8086 processors, allowing it to execute several programs at once.

In 1988 Intel developed the Intel-80386SX microprocessor, which differed from the Intel-80386DX mainly in being cheaper and using a 16-bit external data bus.

The first personal computer based on the Intel-80386 was made by Compaq Computers. In April 1987 IBM announced the PS/2 family with the MCA (Micro Channel Architecture) bus; before this, PC AT computers had used the ISA (Industry Standard Architecture) bus. MCA was 32-bit and ran at 10 MHz. In 1989 nine clone makers (AST, Compaq, Epson, Hewlett-Packard, NEC, Olivetti, Tandy, Wyse and Zenith) developed the EISA (Extended Industry Standard Architecture) bus. Like MCA it was 32-bit, but unlike MCA, EISA was fully compatible with ISA.

In 1989 a new Intel development appeared: the Intel-80486 microprocessor (the 80486DX). The processor was fully compatible with the Intel 80x86 family and contained a built-in math coprocessor and an 8 KB cache. The 80486 was more advanced than the Intel-80386, with a clock frequency of 33 MHz.

In 1991 Intel introduced the Intel-80486SX, which lacked the math coprocessor, and in 1992 the Intel-80486DX2, which ran internally at double the clock frequency, 66 MHz. DX4 processors with clock speeds of 75 and 100 MHz followed.

In addition to Intel, other companies began to produce 486 processors, for example, AMD (Advanced Micro Devices) and Cyrix.

These firms made some improvements of their own and sold the chips for around $100. Soon the VL-Bus, developed by the Video Electronics Standards Association (VESA), became the standard bus for 486 systems; its throughput was 132 MB/s.

Computers based on processors of the Intel-80486 family could run a vast range of software.

Second place after the IBM PC belonged to Apple Computer with its Macintosh PCs, produced on the basis of Motorola processors. These computers were very convenient for use at home, in the office and in school teaching. The latest models, the LC 475, LC 575 and LC 630, based on the Motorola 68LC040 processor, were equipped with a CD-ROM drive.

The most powerful Macintosh computers, the Quadra series, were equipped with a 68040 processor clocked at up to 33 MHz and a coprocessor, and their RAM could be expanded to 256 MB. Quadras were used mainly in printing and advertising, in creating multimedia applications and in other tasks requiring large computing power and the processing of significant volumes of data; they were also suitable for software development. From 1993, computers of the AV subfamily were produced with standard video inputs and outputs, allowing information to be displayed both on a standard display and on an ordinary TV screen.

In addition to the models above, Apple Computer produced portable computers of the PowerBook series. The Performa family, equipped with a fax modem convenient for working from home, won the greatest popularity.

In 1993, Intel began commercial production of a new processor, the Intel Pentium (Intel did not give it the number 80586).
