In 1623, Wilhelm Schickard, a professor at the University of Tübingen, described a "calculating clock", the first mechanical machine capable of adding and subtracting. A working model has since been built from his description.
In 1642, the French mathematician Blaise Pascal (1623-1662) designed a calculating device to ease the work of his father, a tax collector. The device could add decimal numbers. Outwardly it was a box with numerous gears. The basis of the machine was the counting wheel, a gear with ten teeth, each marked with a digit.
To carry tens, one elongated tooth was placed on the gear; it engaged and turned an intermediate gear, which passed the rotation on to the tens wheel. The extra gear was needed so that both counting wheels, units and tens, rotated in the same direction. Each counting wheel was connected to an input lever through a ratchet mechanism (which transmits forward motion but not reverse). Deflecting the lever by a given angle entered a one-digit number into the counter and added it to the total. Since a ratchet drive was attached to every counting wheel in Pascal's machine, it could sum multi-digit numbers.
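The wheel-and-carry mechanism described above can be sketched in Python. This is a minimal illustration, not a description of any historical notation: the function name, the six-wheel counter and the digit-by-digit entry are all assumptions made for the sketch.

```python
def pascaline_add(counter, digit, position):
    """Add a single digit at a decimal position, propagating carries
    the way Pascal's elongated tooth turns the next wheel."""
    counter[position] += digit
    i = position
    while i < len(counter) and counter[i] > 9:  # wheel passed 9: carry engages
        counter[i] -= 10
        if i + 1 < len(counter):                # turn the next wheel one step
            counter[i + 1] += 1                 # (a carry past the last wheel is lost)
        i += 1
    return counter

# Sum 275 + 148 digit by digit, least-significant wheel first
wheels = [0] * 6
for pos, d in enumerate([5, 7, 2]):
    pascaline_add(wheels, d, pos)
for pos, d in enumerate([8, 4, 1]):
    pascaline_add(wheels, d, pos)
print(wheels)  # → [3, 2, 4, 0, 0, 0], i.e. 423 read least-significant first
```

The ratchet corresponds to the fact that `pascaline_add` only ever increases a wheel's value; subtraction on the real machine was done by adding complements.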
In 1673, the German philosopher, mathematician and physicist Gottfried Wilhelm Leibniz (1646-1716) created the "stepped reckoner", a decimal calculating machine that could add, subtract, multiply and divide, and assist in extracting square roots. (Leibniz also studied binary arithmetic, but the machine itself worked in decimal.) It was a more advanced device, with a movable part (a forerunner of the carriage) and a crank that the operator turned. The machine was the prototype of the arithmometers in use from 1820 until the 1960s.
In 1804, the French inventor Joseph Marie Jacquard (1752-1834) devised a way to control the threads of a loom automatically. The loom's operation was programmed by a deck of punched cards, each of which controlled one pass of the shuttle. To switch to a new pattern, the operator simply replaced one deck of cards with another. This loom, controlled by cards with punched holes linked together into a tape, is one of the key inventions that led to the further development of computer technology.
In 1820, Charles Xavier Thomas (1785-1870) created the first commercially successful mechanical calculator, which could not only add and multiply but also subtract and divide. The rapid development of mechanical calculators meant that by 1890 a number of useful functions had been added: storing intermediate results for use in subsequent operations, printing the result, and so on. Inexpensive, reliable machines could now be used for commercial purposes and scientific calculations.
In 1822, the English mathematician Charles Babbage (1792-1871) put forward the idea of a program-controlled calculating machine with an arithmetic unit, a control unit, and input and printing devices. His first design, the Difference Engine, was to be powered by a steam engine. It computed tables of logarithms by the method of finite differences and recorded the results on a metal plate. The working model he built in 1822 was a six-digit calculator capable of performing calculations and printing numerical tables.
A Babbage engine was eventually built by enthusiasts at the London Science Museum: his Difference Engine No. 2, assembled from four thousand iron, bronze and steel parts and weighing three tons. It is laborious to use: each calculation requires turning the machine's crank several hundred (or even thousand) times. Numbers are set on disks arranged in vertical columns, each in a position from 0 to 9. Babbage's later design, the Analytical Engine, was to be driven by a sequence of punched cards containing instructions, that is, a program.
Working alongside Babbage was Lady Ada Lovelace (1815-1852), the only daughter of the poet George Gordon Byron. She wrote the first programs for the machine and introduced many ideas, concepts and terms that survive to this day. She foresaw the modern computer as a multifunctional machine, suited not only to computation but also to working with graphics and sound. In the mid-1970s, the US Department of Defense officially approved the name of the unified programming language of the American armed forces: the language is called Ada. Programmers now have their own professional holiday, "Programmer's Day", celebrated on December 10th, Ada Lovelace's birthday.
In 1855, Georg Scheutz of Stockholm and his son Edvard built the first working difference engine, drawing on the work of C. Babbage. In 1878, the Russian mathematician and mechanic Pafnuty Lvovich Chebyshev created an adding machine with continuous carrying of tens, and in 1881 an attachment to it for multiplication and division.
In 1880, Vilgodt Teofilovich Odhner, an ethnic Swede living in St. Petersburg, designed an arithmometer. His arithmometers were distinguished by reliability, compact size and ease of use. Odhner began working on the machine in 1874 and started mass production in 1890. A later modification, the "Felix", was produced until the 1950s.
Early 20th century
In 1918, the Russian scientist M.A. Bonch-Bruevich and, independently in 1919, the English scientists W. Eccles and F. Jordan created the electronic relay that the British called a trigger (now known as the flip-flop), which played an important role in the development of computer technology.
In 1930, Vannevar Bush (1890-1974) constructed the differential analyzer, in effect the first successful attempt to create a computer capable of performing cumbersome scientific calculations. Bush's role in the history of computer technology is very great, but his name most often comes up in connection with the prophetic article "As We May Think" (1945), in which he described the concept of hypertext.
In 1937, the Harvard mathematician Howard Aiken proposed a project for a large calculating machine. The work was sponsored by IBM president Thomas Watson, who invested $500,000 in it. Design of the Mark-1 began in 1939, and the computer was built by IBM in New York. It contained about 750 thousand parts, 3,304 relays and more than 800 km of wire.
In 1946, John von Neumann set out a number of new ideas on computer organization, including the stored-program concept: keeping the program in the machine's own memory alongside its data. The computer architecture that resulted from von Neumann's ideas has in many ways survived to this day.
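The stored-program idea can be illustrated with a toy machine in Python: instructions and data share one memory, and a fetch-decode-execute loop drives the computation. The instruction set here is invented purely for the sketch and does not correspond to any historical machine.

```python
def run(memory):
    """Execute a program held in the same memory as its data:
    a minimal fetch-decode-execute loop in the von Neumann style."""
    acc, pc = 0, 0                   # accumulator and program counter
    while True:
        op, arg = memory[pc]         # fetch the next instruction
        pc += 1
        if op == "LOAD":             # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Cells 0-3 hold instructions; cells 4-6 hold data. The program
# adds the contents of cells 4 and 5 and stores the sum in cell 6.
program = [
    ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),
    2, 3, 0,
]
print(run(program)[6])  # → 5
```

Because program and data live in one memory, a program could in principle modify its own instructions, a possibility that distinguished stored-program machines from their plugboard-programmed predecessors.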
In 1947, the Mark-2 calculating machine appeared, the first multitasking machine: several buses made it possible to transfer several numbers simultaneously from one part of the computer to another. On December 23, 1947, Bell Telephone Laboratories employees John Bardeen and Walter Brattain first demonstrated their invention, dubbed the transistor. Within ten years this device had opened up completely new possibilities.
In 1948, Academician S.A. Lebedev (1902-1974) and B.I. Rameev proposed the first designs of domestic digital electronic computers: first MESM, the Small Electronic Calculating Machine (1951, Kiev), then BESM, the High-Speed Electronic Calculating Machine (1952, Moscow). In parallel with them, the Strela, Ural, Minsk, Hrazdan and Nairi machines were created.
In 1951, the first production computers, the Ferranti Mark-1 and LEO-1, appeared in England. Five years later, Ferranti released the Pegasus computer, which was the first to embody the concept of general-purpose registers. Jay Forrester patented magnetic-core memory, first used in the Whirlwind-1 machine: two banks of 32x32x17 cores provided storage for 2048 16-bit words, each with one parity bit. Whirlwind was also the first machine to use a universal, non-specialized bus (making the interconnections between the computer's devices flexible), and it used two input-output devices: a Williams cathode-ray tube and a typewriter with punched tape (Flexowriter).
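The single parity bit mentioned above can be sketched in Python. This is even parity over a 16-bit word; the example word is arbitrary and the scheme is illustrative, not Whirlwind's actual storage format.

```python
def parity_bit(word):
    """Even-parity bit for a 16-bit word: 1 if the word has an odd
    number of set bits, so word plus parity always holds an even count."""
    return bin(word & 0xFFFF).count("1") % 2

word = 0b1011_0010_0001_0111       # 8 one-bits, so the parity bit is 0
p = parity_bit(word)
print(p)                           # → 0

# A single flipped bit is detected: the recomputed parity disagrees
corrupted = word ^ (1 << 4)        # flip one bit of the stored word
print(parity_bit(corrupted) == p)  # → False
```

A single parity bit detects any odd number of bit errors but cannot say which bit failed; error-correcting codes came later.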
In 1952 trial operation of the domestic computer BESM-1 began.
In the USSR, A.A. Lyapunov developed the operator method of programming in 1952-1953, and L.V. Kantorovich developed the concept of large-block programming in 1953-1954.
In 1954-1957, IBM employees under the leadership of John Backus developed the first algorithmic language, FORTRAN (FORmula TRANslator). It was used to solve scientific, technical and engineering problems.
In 1958-1959, Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor independently invented the integrated circuit.
In 1959, under the leadership of S.A. Lebedev, the BESM-2 machine was created, with a speed of 10 thousand operations per second. It was used in calculations for the launches of space rockets and the world's first artificial Earth satellites. It was followed by the M-20, for its time one of the fastest machines in the world (20 thousand operations per second).
In 1960, ALGOL (ALGOrithmic Language) appeared, oriented toward scientific applications. It introduced many new concepts, such as block structure, and became the conceptual foundation of many later programming languages. Thirteen European and American programmers, meeting in Paris, approved the ALGOL-60 language standard.
1963 saw the beginning of production of the Minsk-32 computer with external memory on removable magnetic disks. Machines also appeared built on a different, non-semiconductor element base: magnetic elements. At Lomonosov Moscow State University, a team led by N.P. Brusentsov created the Setun machine on such elements (mass-produced in 1962-1964).
The Setun is a small machine built on magnetic elements. It is a single-address machine with fixed-point arithmetic. As its number system it uses balanced ternary, with the digits 0, 1 and -1; Setun was the first machine in the world to use this system.
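Setun's balanced ternary representation is easy to demonstrate in Python: every integer has a unique representation with trits -1, 0 and 1, and negating a number amounts to flipping the sign of every trit. The conversion routine below is a generic sketch, not Setun's actual hardware encoding.

```python
def to_balanced_ternary(n):
    """Represent an integer with trits -1, 0, 1 (most significant first),
    the number system used by the Setun machine."""
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:            # a remainder of 2 becomes -1 with a carry
            r = -1
        n = (n - r) // 3
        trits.append(r)
    return trits[::-1]

def from_balanced_ternary(trits):
    """Evaluate a trit list back to an integer."""
    value = 0
    for t in trits:
        value = value * 3 + t
    return value

print(to_balanced_ternary(8))   # → [1, 0, -1], since 9 + 0 - 1 = 8
print(to_balanced_ternary(-8))  # → [-1, 0, 1]
```

Note that no sign bit is needed: negative numbers fall out of the representation naturally, one of the elegances that attracted Setun's designers to the system.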
In 1964, Douglas Engelbart of the Stanford Research Institute built the first prototype of the mouse. IBM announced the six models of the IBM System/360 family, the first third-generation computers. The models shared a single instruction set and differed from one another in RAM capacity and performance.
In 1967, under the leadership of S.A. Lebedev and V.A. Melnikov, the high-speed BESM-6 computer was created at ITMiVT (the Institute of Precision Mechanics and Computer Engineering). It was followed by Elbrus, a new type of computer with a speed of 10 million operations per second.
In 1968, the American company Burroughs released the first high-speed computers built on LSIs (large-scale integrated circuits), the B2500 and B3500.
In 1968-1970, Professor Niklaus Wirth of ETH Zurich (the Swiss Federal Institute of Technology) created the Pascal language, named after Blaise Pascal, the first designer of a device that now belongs to the class of digital computers. Pascal was designed, on the one hand, to be well suited for teaching programming and, on the other, to allow a wide variety of problems to be solved effectively on the computers of the day.
Second half of the 20th century
October 29, 1969 is considered the birthday of the Network. On that day the very first, though not entirely successful, attempt was made to connect remotely from a computer at the University of California, Los Angeles (UCLA) to a computer at the Stanford Research Institute (SRI). Some 500 kilometers apart, UCLA and SRI became the first nodes of the future ARPANET.
In 1971, Intel (USA) created the first microprocessor, the 4004: a programmable logic device manufactured using LSI technology. The IBM System/370 Model 145 appeared, the first computer with main memory built entirely from integrated circuits. The first pocket calculator, the Pocketronic, was released.
Dennis Ritchie of Bell Labs developed the C programming language, so named because its predecessor was called B.
In 1968, work began in Minsk on the first machine of the ES (Unified System) family. 1971 saw the start of production of the ES series with the ES-1020 (20 thousand operations per second), as production of the Minsk machines and the Penza-built Urals was wound down in the 1970s. The orientation toward IBM systems did not, however, mean mindless copying. That was simply impossible: despite some warming of relations with the West, there was no legal way to obtain the machines or their software. The Ryad models were developed from published descriptions of the IBM architecture and operating systems, so all ES machines can to some extent be considered original developments, and they were all patented.
In 1974, Intel developed the first universal eight-bit microprocessor, the 8080, with 4,500 transistors.
In 1975, Gene Amdahl developed a fourth-generation LSI computer, the Amdahl 470V/6. Gary Kildall of Digital Research developed the CP/M operating system. The young programmer Paul Allen and Harvard student Bill Gates implemented the BASIC language for the Altair; they went on to found Microsoft, today the largest manufacturer of software.
In 1976, the young Americans Steve Jobs and Steve Wozniak founded an enterprise to manufacture Apple personal computers, aimed at a broad range of non-professional users.
In 1980, the Ada language appeared, named in memory of Ada Lovelace, the first programmer in the history of computing. It was created in France, on commission from the US Department of Defense, as a universal programming language, and it includes such features as facilities for systems programming, concurrency, and more.
In 1981, IBM released the first IBM PC personal computer, based on the 8088 microprocessor.
In 1982, Intel released the 80286, a 16-bit microprocessor built from 134,000 transistors and roughly three times faster than competing models, continuing the x86 series begun with the 8086. A distinctive feature of this design was its built-in memory management, which for the first time implemented the principle of software compatibility with subsequent generations of processors.
In 1982, Peter Norton accidentally deleted a file from the hard disk of his personal computer. Recovering it proved difficult and painstaking, and the experience led Norton to write the program that became the prototype of today's utilities.
In 1984, Apple Computer introduced the Macintosh, the first model of the later-famous family, with a user-friendly operating system and graphics capabilities far beyond those of the then-standard IBM-compatible MS-DOS PCs. These computers quickly won millions of fans and became the computing platform of entire industries, such as publishing and education.
Sony and Philips developed the CD-ROM standard for recording data on compact discs. The MIDI and DNS standards also appeared. IBM released the IBM PC/AT personal computer.
In 1985, Intel released the 32-bit 80386 microprocessor, consisting of 275,000 transistors. Microsoft released the first version of the Windows graphical operating environment. In the same year, the C++ programming language appeared.
In 1986, the USSR began producing one of the most popular machines of the SM line, the SM-1810 microcomputer, which could also serve as a personal computer. The personal computers produced by the domestic industry in the mid-1980s fell, by their capabilities, into household and professional classes. The household class included the "Elektronika BK-0010" (BK for "household computer"), produced in Zelenograd, which used an ordinary TV set as a display and offered only 64 KB of RAM. Another development of the Ministry of the Electronics Industry, the "Elektronika-85", was equipped with its own display and 4 MB of RAM. The "Iskra-226" machine also belonged to the professional class.
The end of the 1980s marked the end of the era of Soviet computer building; the heyday of the domestic computer-design schools was already past. Their forty-year history nevertheless had a worthy, if somewhat sad, ending. In 1989, work was completed on the last two Soviet supercomputers: the Elektronika SS BIS entered trial operation, and development of the Elbrus 3-1 was finished. Both machines were the fruit of the creative efforts of the greatest Soviet engineers, students of Sergei Alekseevich Lebedev.
In 1989, Intel released another chip, the 80486, its first processor with more than a million transistors. Microsoft released Word for Windows. The GIF89a version of the GIF graphic file format was published.
In March 1989, Tim Berners-Lee proposed the concept of a new distributed information system, which he called the World Wide Web. Hypertext technology was to make it easy to jump from one document to another. In 1990 the proposal was accepted and the project began: Berners-Lee developed the HTML language (HyperText Markup Language, the main format of Web documents) and a prototype of the World Wide Web.
In 1991, the first free operating system with serious capabilities appeared: Linux. The Finnish student Linus Torvalds, the author of the system, had decided to experiment with the instructions of the Intel 386 processor and posted the result on the Internet. Hundreds of programmers from around the world began extending and reworking the program, and it grew into a fully functional operating system. The name's origin is clear enough: "Lin" from the creator's name and "ux" from UNIX, which the new OS closely resembled, except that it now also ran on computers with the x86 architecture. In 1992, Microsoft released Windows 3.1, and the JPEG graphic format was developed.