
Do you know what a thought experiment, a gedanken experiment, is?
It is an imagined practice, an otherworldly experience, a picturing of what does not actually exist. Thought experiments are like daydreams; they give birth to monsters. Unlike a physical experiment, which is an experimental test of a hypothesis, a "thought experiment" magically substitutes the desired, untested conclusions for experimental verification, manipulating logical constructions that in fact violate logic itself by using unproven premises as if they were proven, that is, by substitution. Thus, the main task of those who put forward "thought experiments" is to deceive the listener or reader by replacing a real physical experiment with its dummy: unsubstantiated reasoning taken on faith, without any physical verification.
Filling physics with imaginary "thought experiments" has led to an absurd, surreal, confused picture of the world. A real researcher must distinguish such "wrappers" from real values.

Relativists and positivists argue that the "thought experiment" is a very useful tool for testing theories (which also arise in our minds) for consistency. In this they deceive people, since any verification can only be carried out by a source independent of the object of verification. The proponent of a hypothesis cannot himself be the test of his own statement, since the very reason for that statement is the absence of contradictions visible to him in it.

We see this in the example of SRT and GR, which have turned into a kind of religion governing science and public opinion. No number of facts that contradict them can overcome Einstein's formula: "If the fact does not correspond to the theory, change the fact" (in another version, "Does the fact not correspond to the theory? So much the worse for the fact").

The most that a "thought experiment" can claim is the internal consistency of the hypothesis within the framework of the proponent's own, often by no means true, logic. It does not check conformity with practice. A real test can take place only in a real physical experiment.

An experiment is an experiment precisely because it is not a refinement of thought but a test of thought. Thought that is consistent within itself cannot verify itself. This was shown by Kurt Gödel.


The need for devices to speed up counting arose thousands of years ago. At first, the simplest aids, such as counting sticks, were used. Later came the abacus, better known to us as the counting frame, which allowed only the simplest arithmetic operations to be performed. Much has changed since then: almost every house has a computer, and there is a smartphone in your pocket. All of this can be grouped under the common name of "computing technology" or "computer technology." In this article you will learn a little more about the history of its development.

1623. Wilhelm Schickard thinks: "Why shouldn't I invent the first adding machine?" And he invents it, obtaining a mechanical device capable of performing the basic arithmetic operations (addition, subtraction, multiplication and division) and working by means of gears and cylinders.

1703. Gottfried Wilhelm Leibniz describes the binary number system in his treatise "Explication de l'Arithmétique Binaire" ("Explanation of Binary Arithmetic"). Building computers on it is much simpler, and Leibniz knew this: back in 1679 he had drawn up a design for a binary computing machine. In practice, however, the first such device appeared only in the middle of the 20th century.
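To make the idea concrete, here is a small illustrative sketch (my own example, not from the source) of how a decimal number is rewritten in the binary system Leibniz described: every digit is either 0 or 1, which is exactly what a two-state hardware element can represent.

```python
# Illustrative sketch: decimal-to-binary conversion by repeated division by 2.
def to_binary(n: int) -> str:
    """Return the binary digits of a non-negative integer."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))   # the remainder is the next (lowest) binary digit
        n //= 2
    return "".join(reversed(bits))

print(to_binary(1703))  # -> '11010100111'
```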

1804. Punched cards appear for the first time and remain in use until the 1970s. A punched card is a sheet of thin cardboard with holes in certain places; information was recorded as various sequences of these holes.

1820. Charles Xavier Thomas (yes, almost like Professor X) releases the Thomas arithmometer, which went down in history as the first mass-produced calculating machine.

1835. Charles Babbage sets out to invent his own Analytical Engine and describes it. Initially the device was meant to compute logarithmic tables with high accuracy, but Babbage later changed his mind, and his dream became a general-purpose machine. At the time, creating such a device was quite realistic, but working with Babbage proved difficult because of his character, and as a result of disagreements the project was closed.

1845 Israel Staffel creates the first ever device capable of extracting square roots from numbers.

1905. Percy Ludgate publishes a design for a programmable mechanical computer.

1936. Konrad Zuse decides to create his own computer. He calls it the Z1.

1941 Konrad Zuse releases the Z3, the world's first program-controlled computer. Subsequently, several dozen more devices of the Z series were released.

1961 Launch of the ANITA Mark VII, the world's first fully electronic calculator.

A few words about generations of computers.

1st generation. These are the so-called vacuum-tube computers, built on electron tubes. The first such device was created in the middle of the 20th century.

2nd generation. Everyone used first-generation computers until, in 1947, Walter Brattain and John Bardeen invented a very important thing: the transistor. This is how the second generation of computers appeared. They consumed much less energy, and their performance was higher. These devices were common in the 1950s and 1960s, until the integrated circuit was invented in 1958.

3rd generation. The operation of these computers was based on integrated circuits, each of which contained dozens or hundreds of transistors. However, the arrival of the third generation did not stop the production of second-generation computers.

4th generation. In 1969, Ted Hoff came up with the idea of replacing many integrated circuits with one small device, later called the microprocessor. This made it possible to create very small microcomputers. The first such device was released by Intel. In the 1980s, microprocessors and microcomputers became the most common kind, and we still use them.

It was a brief history of the development of computer technology and computer science. I hope I managed to interest you. Goodbye!

Early counting aids and devices

Mankind learned to use the simplest counting aids thousands of years ago. The most pressing need was to determine the number of goods used in barter. One of the simplest solutions was to use the weight equivalent of the item being exchanged, which did not require an exact count of its components. For these purposes the simplest balance scales were used, which thus became one of the first devices for the quantitative determination of mass.

The principle of equivalence was also widely used in another simple counting device familiar to many, the abacus, or counting frame. The number of objects counted corresponded to the number of beads moved on this instrument.

A relatively complex counting device is the rosary, used in the practice of many religions. On its beads, as on an abacus, the believer counted the prayers said, and on completing a full circle moved special counter beads on a separate strand to record the number of circles counted.

With the invention of gears, much more complex calculating devices appeared. The Antikythera mechanism, discovered at the beginning of the 20th century at the wreck of an ancient ship that sank around 65 BC (according to other sources, around 87 BC), could even model the motion of the planets. It was presumably used for calendar calculations for religious purposes, predicting solar and lunar eclipses, determining the time of sowing and harvesting, and so on. The calculations were performed by a train of more than 30 bronze gears and several dials; a differential gear was used to compute the lunar phases, an invention that researchers long believed appeared no earlier than the 16th century. With the passing of antiquity, however, the skills needed to create such devices were forgotten, and it took about one and a half thousand years for people to learn to build mechanisms of similar complexity again.

The Counting Clock by Wilhelm Schickard

This was followed by the machines of Blaise Pascal ("Pascaline", 1642) and Gottfried Wilhelm Leibniz.

ANITA Mark VIII, 1961

In the Soviet Union at that time, the most famous and widespread calculator was the Felix mechanical adding machine, produced from 1929 to 1978 at factories in Kursk (Schetmash plant), Penza and Moscow.

The advent of analog computers in the prewar years

Main article: History of analog computers

Differential analyzer, Cambridge, 1938

The first electromechanical digital computers

Z-series by Konrad Zuse

Reproduction of the Zuse Z1 computer at the Technikmuseum, Berlin

Zuse and his company built other computers, each of whose names began with the letter Z. The best-known machines were the Z11, sold to the optical industry and to universities, and the Z22, the first computer with magnetic memory.

British Colossus

In October 1947, the directors of Lyons & Company, a British firm that owned a chain of shops and restaurants, decided to take an active part in the commercial development of computers. The LEO I computer began operating in 1951 and was the first in the world to be used regularly for routine office work.

The University of Manchester machine became the prototype for the Ferranti Mark I. The first such machine was delivered to the university in February 1951, and at least nine others were sold between 1951 and 1957.

The second-generation IBM 1401, produced in the early 1960s, took about a third of the world computer market; more than 10,000 of these machines were sold.

The use of semiconductors made it possible to improve not only the central processing unit but also the peripheral devices. The second generation of data storage devices could already hold tens of millions of characters and numbers. A division emerged between fixed storage devices, connected to the processor by a high-speed data channel, and removable devices. Replacing a disk pack in a drive took only a few seconds. Although the capacity of removable media was usually lower, their interchangeability made it possible to store an almost unlimited amount of data. Magnetic tape was commonly used for archiving data, because it provided more storage at a lower cost.

In many second-generation machines, the functions of communicating with peripherals were delegated to specialized coprocessors. For example, while a peripheral processor reads or punches cards, the main processor performs calculations or program branches. One data bus carries data between memory and processor during the fetch-and-execute cycle, while other buses typically serve the peripherals. On the PDP-1, a memory access cycle took 5 microseconds; most instructions required 10 microseconds: 5 to fetch the instruction and another 5 to fetch the operand.
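As a rough, illustrative check of what those timings imply (my own arithmetic, not a figure from the source), the instruction rate follows directly from the cycle time:

```python
# Back-of-envelope estimate based on the PDP-1 timings quoted above.
memory_cycle_us = 5                       # one memory access, in microseconds
instruction_us = 2 * memory_cycle_us      # instruction fetch + operand fetch

instructions_per_second = 1_000_000 / instruction_us
print(instructions_per_second)            # 100000.0 -> about 100,000 instructions/sec
```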

The best Soviet computer of the second generation is considered to be the BESM-6, created in 1966.

1960s onwards: third and subsequent generations

The rapid growth in the use of computers began with the so-called third-generation machines. This began with the invention of the integrated circuit, made independently by Nobel laureate Jack Kilby and by Robert Noyce. It later led to the invention of the microprocessor by Ted Hoff (Intel).

The advent of microprocessors led to the development of microcomputers: small, inexpensive computers that could be owned by small companies or individuals. Microcomputers, the fourth generation of which first appeared in the 1970s, became ubiquitous in the 1980s and beyond. Steve Wozniak, one of the founders of Apple Computer, became known as the developer of the first mass-produced home computer, and later of the first personal computer. Computers based on the microcomputer architecture, with features added from their larger counterparts, now dominate most market segments.

In the USSR and Russia

1940s

In 1948, under the supervision of Doctor of Physical and Mathematical Sciences S. A. Lebedev, work began in Kyiv on the MESM (small electronic calculating machine). It went into operation in October 1951.

At the end of 1948, I. S. Bruk and B. I. Rameev of the Krzhizhanovsky Energy Institute received an author's certificate for a computer with a common bus, and in 1950-1951 they built it. This machine was the first in the world to use semiconductor (copper-oxide) diodes instead of vacuum tubes. From 1948 onward, Bruk worked on electronic digital computers and on control by means of computing technology.

In the late 1950s, the principles of parallel computation were developed (by A. I. Kitov and others), on the basis of which the M-100 (for military purposes), one of the fastest computers of its time, was built.

In July 1961, the USSR launched the first semiconductor universal control machine, the Dnepr (before that there had been only specialized semiconductor machines). Even before serial production began, experiments were carried out with it in controlling complex technological processes at the

The rapid development of digital computing technology and the formation of a science of the principles of its construction and design began in the 1940s, when electronics and microelectronics became its technical base, together with achievements in the field of artificial intelligence.

Until then, for almost 500 years, computing technology had amounted to the simplest devices for performing arithmetic operations on numbers. The basis of almost all the devices invented over those five centuries was the gear wheel, designed to register the 10 digits of the decimal number system. The world's first sketch of a thirteen-digit decimal adder based on such wheels belongs to Leonardo da Vinci.

The first mechanical digital computing device actually built was the "Pascaline" of the great French scientist Blaise Pascal: a 6-digit (in some versions 8-digit) device on gears, designed for adding and subtracting decimal numbers (1642).

Thirty years after the Pascaline, in 1673, Gottfried Wilhelm Leibniz's "arithmetic instrument" appeared: a twelve-digit decimal device for performing arithmetic operations, including multiplication and division.

At the end of the 18th century, two events took place in France that were of fundamental importance for the further development of digital computing technology. These events include:

• Joseph Jacquard's invention of programmed control of a loom using punched cards;

• the development by Gaspard de Prony of a computing technology that divided numerical calculations into three stages: development of a numerical method, drawing up a program as a sequence of arithmetic operations, and carrying out the actual calculations on numbers in accordance with the compiled program (a small worked example of this three-stage scheme is sketched below).
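As an illustration of the three stages, here is a toy example of my own (not from the source): producing a table of squares using only additions.

```python
# Stage 1 - numerical method: consecutive squares obey (n+1)^2 = n^2 + (2n + 1),
#           so the whole table can be produced by additions alone.
# Stage 2 - the "program": start from 0; at each step add the current odd number,
#           then increase that odd number by 2.
# Stage 3 - carrying out the additions:
square, odd = 0, 1
table = []
for n in range(10):
    table.append(square)   # this is n^2
    square += odd          # next square
    odd += 2               # next odd number
print(table)               # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```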

These innovations were later used by the Englishman Charles Babbage, who took a qualitatively new step in the development of computing tools: the transition from manual to automatic execution of calculations according to a compiled program. He developed the design of the Analytical Engine, a mechanical universal digital computer with program control (1830-1846). The machine was to consist of five units: the arithmetic unit, the memory (store), the control unit, the input unit and the output unit.

It was from just such units that the first computers, which appeared 100 years later, were built. The arithmetic unit was to be based on gear wheels, and it was proposed to implement the memory on them as well (for thousands of 50-digit numbers). Punched cards were to be used for entering data and programs. The projected speed was addition and subtraction in 1 second, multiplication and division in 1 minute. Besides arithmetic operations, there was a conditional branch instruction.

Although individual components of the machine were built, the complete machine could not be created because of its sheer bulk: it would have required more than 50,000 gear wheels alone. The inventor planned to power his Analytical Engine with a steam engine.

In 1870 (a year before Babbage's death), the English scientist Jevons designed the world's first "logical machine," which made it possible to mechanize the simplest logical inferences.

The creators of logic machines in pre-revolutionary Russia were Pavel Dmitrievich Khrushchev (1849-1909) and Alexander Nikolaevich Shchukarev (1884-1936), who worked in educational institutions in Ukraine.

Babbage's brilliant idea was realized by the American scientist Howard Aiken, who in 1944 created the first relay-mechanical computer in the USA. Its main units, the arithmetic unit and the memory, were built on gear wheels. If Babbage was far ahead of his time, then Aiken, by using those same gears, relied on technically outdated solutions when implementing Babbage's idea.

It should be noted that ten years earlier, in 1934, the German student Konrad Zuse, while working on his graduation project, decided to build a digital computer with program control. This machine was the first in the world to use the binary system. In 1937 the Z1 machine performed its first calculations. It was a binary 22-bit floating-point machine with a memory of 64 numbers, and it worked on a purely mechanical (lever) basis.

In the same 1937, when the world's first mechanical binary machine Z1 started working, John Atanasoff (Bulgarian by birth, who lived in the USA) began the development of a specialized computer, using vacuum tubes (300 tubes) for the first time in the world.

In 1942-43, the Colossus computer was created in England (with the participation of Alan Turing). This machine, consisting of 2,000 vacuum tubes, was intended for decoding radio messages from the German Wehrmacht. Since the works of Zuse and Turing were secret, few knew about them at that time and they did not cause any resonance in the world.

Only in 1946 did information appear about the ENIAC (Electronic Numerical Integrator and Computer), created in the USA by John Mauchly and Presper Eckert using electronic technology. The machine contained 18,000 vacuum tubes and performed about 3,000 operations per second. However, it remained decimal, and its memory was only 20 words. Programs were stored outside of RAM.

Almost simultaneously, in 1949-1952, scientists in England, the Soviet Union and the USA (Maurice Wilkes, EDSAC computer, 1949; Sergey Lebedev, MESM computer, 1951; Isaac Bruk, M-1 computer, 1952; John Mauchly, Presper Eckert and John von Neumann, EDVAC computer, 1952) created computers with the program stored in memory.

In general, five generations of computers are distinguished.

The first generation (1945-1954) is characterized by machines built on vacuum tubes. This was the era in which computing technology took shape. Most first-generation machines were experimental devices, built to test one theoretical proposition or another. Their weight and size were such that they often required separate buildings.

The founders of computer science are considered to be Claude Shannon - the creator of information theory, Alan Turing - a mathematician who developed the theory of programs and algorithms, and John von Neumann - the author of the design of computing devices, which still underlies most computers. In the same years, another new science related to computer science arose - cybernetics - the science of management as one of the main information processes. The founder of cybernetics is the American mathematician Norbert Wiener.

In the second generation (1955-1964), transistors were used instead of vacuum tubes, and magnetic cores and magnetic drums, the distant ancestors of modern hard drives, began to serve as memory devices. All this made it possible to drastically reduce the size and cost of computers, which were then built for sale for the first time.

But the main achievements of this era belong to software. In the second generation, what is today called an operating system appeared for the first time. At the same time, the first high-level languages were developed: Fortran, Algol, Cobol. These two important advances greatly simplified and accelerated the writing of programs for computers.

This expanded the range of computer applications. Now not only scientists could count on access to computers: computers found use in planning and management, and some large firms even began to computerize their bookkeeping, anticipating the mass adoption of this process by twenty years.

In the third generation (1965-1974), integrated circuits were used for the first time: entire devices and assemblies of tens and hundreds of transistors made on a single semiconductor crystal (microchips). At the same time, semiconductor memory appeared, which is still used in personal computers as main memory.

During these years, the production of computers acquired an industrial scale. IBM was the first to implement a series of fully compatible computers, from the smallest, the size of a small cabinet (nothing smaller was made at the time), to the most powerful and expensive models. The most widespread in those years was the IBM System/360 family, on the basis of which the ES series of computers was developed in the USSR. As early as the first half of the 1960s, the first minicomputers appeared: small, low-powered computers that small firms or laboratories could afford. Minicomputers were the first step towards personal computers, prototypes of which appeared only in the mid-70s.

Meanwhile, the number of elements and connections between them, fitting in one microcircuit, was constantly growing, and in the 70s, integrated circuits already contained thousands of transistors.

In 1971, Intel released the first microprocessor, which was intended for desktop calculators that had just appeared. This invention was destined to produce a real revolution in the next decade. The microprocessor is the main component of the modern personal computer.

At the turn of the 1960s and 1970s (in 1969), the first wide-area computer network, ARPANET, the prototype of the modern Internet, was born. In the same period the Unix operating system appeared, followed shortly by the C programming language; both had a huge impact on the software world and still retain their leading positions.

The fourth generation (1975-1985) is characterized by fewer fundamental innovations in computer science. Progress proceeded mainly along the path of developing what had already been invented, above all by increasing the power and miniaturization of the element base and of the computers themselves.

The most important innovation of the fourth generation is the appearance of personal computers in the early 1980s. Thanks to personal computers, computing technology became truly mass-market and generally accessible. Although personal computers and minicomputers still lagged behind large machines in computing power, the lion's share of innovations, such as the graphical user interface, new peripheral devices and global networks, is associated with the emergence and development of precisely this technology.

Large computers and supercomputers, of course, continue to evolve. But now they no longer dominate the computer arena as they once did.

Some characteristics of the four generations of computing technology are given in Table 1.1.

Table 1.1. Generations of computing

Characteristic                             | 1st generation              | 2nd generation        | 3rd generation     | 4th generation
Main element                               | Vacuum tube                 | Transistor            | Integrated circuit | Large-scale integrated circuit (microprocessor)
Number of computers in the world (pcs.)    | —                           | —                     | Tens of thousands  | Millions
Computer dimensions                        | —                           | Significantly smaller | —                  | Microcomputer
Performance (conditional), operations/sec  | A few units                 | A few dozens          | Several thousand   | Several tens of thousands
Information carrier                        | Punched card, punched tape  | Magnetic              | —                  | —

The fifth generation (1986 to the present) is largely defined by the results of the work of the Japanese committee for scientific research in the field of computers, published in 1981. According to that project, fifth-generation computers and computing systems, in addition to high performance and reliability at lower cost achieved with the latest technologies, must satisfy the following qualitatively new functional requirements:

• ensure ease of use through voice input/output and interactive processing of information in natural languages;

• provide the possibility of learning, associative reasoning and logical inference;

• simplify the creation of software by automating the synthesis of programs from specifications of the initial requirements stated in natural languages;

• improve the basic characteristics and operational qualities of computing technology to meet various social needs, and improve the cost-benefit ratio, speed, lightness and compactness of computers;

• provide a diversity of computing technology, high adaptability to applications and reliability in operation.

At present, intensive work is under way on optoelectronic computers with massive parallelism and a neural structure: a distributed network of a large number (tens of thousands) of simple microprocessors that model the architecture of biological neural systems.


The purpose of the lesson:

  1. to acquaint students with the history of the development of computer technology, with the devices that preceded computers, and with their inventors;
  2. to give an idea of the connection between the development of computers and the development of human society;
  3. to acquaint students with the main features of computers of different generations;
  4. to develop cognitive interest and the ability to use additional literature.

Lesson type: learning new material

Format: lecture lesson

Software and didactic support: PC, presentation slides depicting the main devices, portraits of inventors and scientists.

Lesson plan:

  1. Organizational moment
  2. Actualization of knowledge
  3. The prehistory of computers
  4. Generations of computers (computers)
  5. The Future of Computers
  6. Consolidation of new knowledge
  7. Summing up the lesson
  8. Homework

1. Organizational moment

Stage task: Prepare students for work in the lesson. (Check the readiness of the class for the lesson, the availability of school supplies, attendance)

2. Actualization of knowledge

Stage task: Preparing students for the active assimilation of new knowledge, to ensure the motivation and acceptance by students of the goal of educational and cognitive activity. Setting lesson goals.

Hello! What technical inventions do you think have particularly changed the way people work?

(Students express their opinions on this issue, the teacher corrects them if necessary)

- You are right: indeed, the main technical invention that has changed human work is the computer, the electronic computing machine. Today in the lesson we will learn what computing devices preceded the appearance of computers, how computers themselves changed, and how the computer took shape step by step, when a machine designed simply for counting became a complex technical device. The topic of our lesson is "The history of computer technology. Generations of computers." The purpose of our lesson: to get acquainted with the history of the development of computer technology, with the devices that preceded computers and their inventors, and with the main features of computers of different generations.

In the lesson, we will work with the help of a multimedia presentation, consisting of 4 sections "Prehistory of computers", "Generations of computers", "Gallery of scientists", "Computer dictionary". In each section there is a subsection "Test yourself" - this is a test in which you will immediately find out the result.

3. The prehistory of computers

Draw students' attention to the fact that a computer is an electronic computing machine; the name "computer" comes from the English verb "compute," to calculate, so the word can be translated as "calculator." In other words, in both names the main meaning is calculation, although we know very well that modern computers allow us not only to calculate but also to create and process texts, pictures, video and sound. Let's take a look at history...

(at the same time we draw up the table “Prehistory of computers” in a notebook)

"The Prehistory of Computers"

Ancient man mastered counting earlier than writing. Man chose his fingers as his first aid in counting; it was the presence of ten fingers that formed the basis of the decimal number system. In different countries people speak and write in different languages, but they count in the same way. In the 5th century BC the Greeks and Egyptians used the ABACUS for counting, a device similar to the Russian counting frame.

Abacus is a Greek word that translates as "counting board." The idea behind it is a special computational field in which counting elements are moved according to certain rules. Indeed, originally the abacus was a board covered with dust or sand on which lines could be drawn and pebbles shifted. In ancient Greece the abacus served primarily for money transactions: large monetary units were counted on the left side and small change on the right. Counting was done in a biquinary (two-five) number system. On such a board it was easy to add and subtract by adding or removing pebbles and carrying them from column to column.

Arriving in Ancient Rome, the abacus changed in appearance. The Romans began to make it from bronze, ivory or colored glass. The board had two rows of slots along which counters could be moved. The abacus turned into a real counting device, allowing even fractions to be represented, and it was much more convenient than the Greek one. The Romans called these pebbles calculi; from this came the Latin verb calculare, "to calculate," and from it the Russian word "calculator."

After the fall of the Roman Empire there was a decline in science and culture, and the abacus was forgotten for a time. It revived and spread throughout Europe only in the 10th century. The abacus was used by merchants, money changers and artisans. Even six centuries later it remained an essential tool for performing calculations.

Naturally, over such a long period the abacus changed its appearance, and in the 12th-13th centuries it took the form of so-called counting on lines and between them. This form of reckoning persisted in some European countries until the end of the 18th century, and only then finally gave way to calculation on paper.

In China the abacus has been known since the 5th century BC. Counting sticks were laid out on a special board; gradually they were replaced by multi-colored counters, and in the 5th century AD the Chinese abacus, the suan-pan, appeared. It was a frame with beads strung on rods in two decks, seven beads on each rod: two above the crossbar and five below. From China the suan-pan came to Japan; this happened in the 16th century, and there the device was called the soroban.
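As a small illustration (my own sketch, not from the lesson text), here is how a decimal digit can be read off one rod of such an abacus, with upper beads worth five and lower beads worth one:

```python
# Illustrative sketch: represent each decimal digit as (heaven beads, earth beads),
# where a heaven bead counts as 5 and an earth bead counts as 1.
def abacus_digit(d: int) -> tuple[int, int]:
    """Return (heaven_beads, earth_beads) for a decimal digit 0-9."""
    return d // 5, d % 5

for digit_char in "1703":
    heaven, earth = abacus_digit(int(digit_char))
    print(f"digit {digit_char}: {heaven} heaven bead(s) + {earth} earth bead(s)")
```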

In Russia the abacus appeared at about the same time as in Japan. But the Russian abacus was invented independently, as the following facts show. First, the Russian abacus is very different from the Chinese one. Second, this invention has its own history.

In Russia, "counting with bones" was widespread. It was close to the European account on the lines, but the scribes used fruit stones instead of tokens. In XVL, a plank account arose, the first version of Russian accounts. Such accounts are now kept in the Historical Museum in Moscow.

The abacus remained in use in Russia for almost 300 years and was displaced only by cheap pocket calculators.

The world's first automatic device that could perform addition was built on the basis of a mechanical clock; it was developed in 1623 by Wilhelm Schickard, a professor of oriental languages at a German university. But Blaise Pascal, Gottfried Leibniz and Charles Babbage certainly made an invaluable contribution to the development of devices that help perform calculations.

In 1642, one of the greatest scientists in the history of mankind, the French mathematician, physicist, philosopher and theologian Blaise Pascal, invented and built a mechanical device for adding and subtracting numbers: the ARITHMOMETER. What material do you think the first adding machine in history was made of? (Wood.)

The main idea behind the design of the future machine had taken shape: automatic carrying of digits. "Each wheel ... of a certain order, making a movement through ten arithmetic digits, causes the next one to move by only one digit." This claim asserted Blaise Pascal's priority in the invention and secured his right to manufacture and sell the machines.

Pascal's machine performed the addition of numbers on special disks, or wheels. The decimal digits of a five-digit number were set by turning disks marked with digit divisions, and the result was read in windows. Each disk had one elongated tooth so that the carry to the next digit could be taken into account.

The initial numbers were set by turning the setting wheels; rotating the handle set various gears and rollers in motion, and as a result special wheels with digits showed the result of the addition or subtraction.
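A minimal sketch of my own (not Pascal's notation) of the carry idea described above: each wheel holds one decimal digit, and a full turn of a wheel advances the next wheel by one.

```python
# Illustrative simulation of automatic carry between decade wheels.
def add_to_wheels(wheels: list[int], amount: int, position: int = 0) -> None:
    """Add `amount` to the wheel at `position` (0 = units), propagating carries."""
    wheels[position] += amount
    while position < len(wheels) and wheels[position] > 9:
        carry, wheels[position] = divmod(wheels[position], 10)
        if position + 1 < len(wheels):
            wheels[position + 1] += carry   # the elongated tooth nudges the next wheel
        position += 1

wheels = [0, 0, 0, 0, 0]        # five wheels, units wheel first
add_to_wheels(wheels, 7)        # set 7
add_to_wheels(wheels, 8)        # add 8 -> units wheel overflows, carry into tens
print(list(reversed(wheels)))   # [0, 0, 0, 1, 5], i.e. 15
```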

Pascal was one of the greatest geniuses of mankind: a mathematician, physicist, mechanician, inventor and writer. Theorems of mathematics and laws of physics bear his name; in physics the unit of pressure is called the pascal, and in computer science one of the most popular programming languages is named after him.

In 1673, the German mathematician and philosopher Gottfried Wilhelm Leibniz invented and built an adding machine that could not only add and subtract numbers but also multiply and divide them. The poverty and primitiveness of the first computing devices did not prevent Pascal and Leibniz from expressing a series of interesting ideas about the role of computing technology in the future. Leibniz wrote about machines that would work not only with numbers but also with words, concepts and formulas, and that could perform logical operations. This idea seemed absurd to most of Leibniz's contemporaries; in the 18th century his views were ridiculed by the great English satirist J. Swift, author of the famous novel Gulliver's Travels.

Only in the 20th century did the significance of the ideas of Pascal and Leibniz become clear.

Along with computing devices, mechanisms for AUTOMATIC WORK ACCORDING TO A SET PROGRAM (jukeboxes, striking clocks, Jacquard looms) also developed.

At the beginning of the 19th century, the English mathematician Charles Babbage, who was engaged in compiling tables for navigation, developed the DESIGN of an "analytical" computing engine based on the PRINCIPLE OF PROGRAM CONTROL. Babbage's innovative idea was taken up and developed by his student Ada Lovelace, daughter of the poet George Byron, who became the first programmer in the world. However, the practical implementation of Babbage's project was impossible because industry and technology were not yet sufficiently developed.

The main elements of the Babbage machine, inherent in a modern computer:

  1. The Store: the unit where the initial numbers and intermediate results are kept. In a modern computer this is the memory.
  2. The Mill: the arithmetic unit in which operations are performed on numbers taken from the Store. In a modern computer this is the processor.
  3. Units for entering the initial data: the input device.
  4. Printing of the results: the output device.

The architecture of the machine corresponds in practice to the architecture of modern computers, and the instructions that the Analytical Engine executed basically include all the instructions of a processor.
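To make that correspondence concrete, here is a deliberately simplified sketch of my own (not Babbage's design) in which a Store holds numbers, a Mill does the arithmetic, and a control loop works through a "program" of instructions:

```python
# Toy fetch-and-execute loop mapping Babbage's units onto modern ones.
store = {"x": 7, "y": 5, "result": 0}            # the Store: memory
program = [("ADD", "x", "y", "result"),          # the program (Babbage's punched cards)
           ("PRINT", "result", None, None)]

def mill(op: str, a: int, b: int) -> int:
    """The Mill: perform one arithmetic operation."""
    return a + b if op == "ADD" else 0

for op, src1, src2, dst in program:              # the control unit
    if op == "ADD":
        store[dst] = mill(op, store[src1], store[src2])
    elif op == "PRINT":
        print(store[src1])                       # the output device
# prints: 12
```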

An interesting historical fact is that the first program for the Analytical Engine was written by Ada Augusta Lovelace, the daughter of the great English poet George Byron. It was Babbage who inspired her with the idea of creating a computer.

The idea of programming mechanical devices using punched cards was first realized in 1804 in a loom; it was loom designers who first used them. The Lyon weaver Joseph Marie Jacquard succeeded in this: in 1801 he created an automatic loom controlled by punched cards.

With each pass of the shuttle, a thread was raised or lowered depending on whether or not there was a hole in the card. The transverse thread could pass around each longitudinal thread on one side or the other according to the program on the punched card, creating an intricate pattern of interlaced threads. This weave is called "jacquard" and is considered one of the most complex and intricate weaves. The programmable loom was the first mass-produced industrial device and is considered one of the most advanced machines ever created by man.
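A tiny illustration of my own (not an actual Jacquard card layout): each card row is a pattern of holes, and a hole raises the corresponding warp thread while the absence of a hole leaves it down.

```python
# Illustrative sketch: card rows of holes (1) and blanks (0) controlling warp threads.
card = [
    [1, 0, 1, 0, 1, 0],   # row read on the first shuttle pass
    [0, 1, 0, 1, 0, 1],   # row read on the second pass
]
for row in card:
    # '#' marks a raised thread (hole in the card), '.' a lowered one
    print("".join("#" if hole else "." for hole in row))
# output:
# #.#.#.
# .#.#.#
```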

The idea of writing a program on punched cards came to the first programmer, Ada Augusta Lovelace. It was she who proposed using punched cards in Babbage's Analytical Engine. In one of her letters she wrote: "The Analytical Engine weaves algebraic patterns just as the loom weaves flowers and leaves."

Herman Hollerith also used punched cards in his machine to record and process information. Punched cards were also used in the first generation of computers.

Until the 1940s, computing technology was represented by adding machines, which evolved from mechanical to electric, in which electromagnetic relays spent several seconds multiplying numbers and which worked on exactly the same principles as the adding machines of Pascal and Leibniz. They were also very unreliable and broke down frequently. It is interesting that the cause of one breakdown of an electric calculating machine was a moth stuck in a relay; in English a moth or beetle is a "bug," hence the term "bug" for a malfunction in a computer.

Herman Hollerith was born on February 29, 1860, in the American city of Buffalo, into a family of German immigrants. Herman was good at mathematics and the natural sciences, and at the age of 15 he entered the School of Mines at Columbia University. The talented young man was noticed by a professor of the same university, who after graduation invited him to the national census bureau that he headed. A census was taken every ten years; the population was constantly growing, and by then it numbered about 50 million people in the United States. Filling out a card for each person by hand and then counting and processing the results was practically impossible: the process dragged on for several years, almost until the next census. A way out had to be found. Herman Hollerith was inspired to mechanize this process by Dr. John Billings, head of the department of consolidated data, who suggested using punched cards to record the information.

Hollerith called his machine the tabulator, and in 1887 it was tested in Baltimore. The results were positive, and the experiment was repeated in St. Louis; the gain in time was nearly tenfold. The US government immediately signed a contract with Hollerith for the supply of tabulators, and in 1890 the census was carried out using his machines. Processing the results took less than two years and saved 5 million dollars. The Hollerith system not only provided high speed but also made it possible to compare statistical data across a wide range of parameters. Hollerith developed a convenient keyboard punch that allowed about 100 holes per minute to be punched simultaneously on several cards, and he automated the feeding and sorting of the punched cards. Sorting was done by a device consisting of a set of boxes with lids: the punched cards moved along a kind of conveyor, with spring-loaded reading pins on one side and a reservoir of mercury on the other. When a pin passed through a hole in the card, it touched the mercury on the other side and closed an electrical circuit; the lid of the corresponding box opened, and the card dropped into it. The tabulator was used for population censuses in several countries.
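The sorting step described above is essentially a distribution of cards into bins by the value punched in one column. Here is a small illustrative sketch of my own (the card values are invented), showing one such sorting pass:

```python
# Illustrative sketch: one pass of tabulator-style sorting - each card drops into
# the box that matches the digit read from a chosen column.
from collections import defaultdict

cards = [1893, 1860, 1887, 1890, 1896]   # hypothetical values punched on cards
boxes = defaultdict(list)
for card in cards:
    digit = card % 10                    # the digit sensed by the reading pin
    boxes[digit].append(card)

for digit in sorted(boxes):
    print(digit, boxes[digit])
# 0 [1860, 1890]
# 3 [1893]
# 6 [1896]
# 7 [1887]
```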

In 1896 Herman Hollerith founded the Tabulating Machine Company (TMC), and his machines were used everywhere, both at large industrial enterprises and in ordinary firms; in 1900 the tabulator was used for the next census. The company was later renamed IBM (International Business Machines).

4. Generations of computers (computers)

(in parallel, we draw up entries in a notebook and a table “Generations of computers (computers)”)

COMPUTER GENERATIONS

Generation | Period | Element base | Speed (ops/sec) | Information carriers | Programs | Application | Examples of computers
I          |
II         |
III        |
IV         |
V          |

1st computer generation: In the 1930s a breakthrough, a radical revolution, occurred in physics. Computers began to use not wheels, rollers and relays but vacuum electron tubes, and the transition from electromechanical elements to electronic ones immediately increased the speed of the machines hundreds of times. The first working computer was built in the USA in 1945 at the University of Pennsylvania by the scientists Eckert and Mauchly and was called ENIAC. It was built by order of the US Department of Defense for air-defense systems and for the automation of control: to calculate the trajectory and speed of a projectile correctly and hit an airborne target, a system of six differential equations had to be solved, and this was the problem the first computer was to solve. The first computer occupied two floors of a building, weighed 30 tons and consisted of tens of thousands of electron tubes connected by wires with a total length of 10 thousand km. When ENIAC was running, the electricity in the town would go out, so much power did the machine consume; the vacuum tubes quickly overheated and failed, and a whole group of students did nothing but constantly search for and replace burned-out tubes.

In the USSR, the founder of computer technology was Sergey Alekseevich Lebedev, who created the MESM (small electronic calculating machine) in 1951 (Kyiv) and the BESM (high-speed electronic calculating machine) in 1952 (Moscow).

2nd generation: In 1948 the American scientist Walter Brattain invented the TRANSISTOR, a semiconductor device that replaced radio tubes. The transistor was much smaller than a vacuum tube, was more reliable and consumed far less electricity; a single transistor replaced 40 vacuum tubes! Computers became smaller and much cheaper, and their speed reached hundreds of thousands of operations per second. Now computers were the size of a refrigerator and could be purchased and used by scientific and technical institutes. At that time the USSR kept pace with the times and produced world-class computers such as the BESM-6.

3rd generation: The second half of the 20th century saw the rapid development of science and technology, especially semiconductor physics, and from 1964 transistors began to be placed on microcircuits made on the surface of a crystal. This made it possible to break the barrier of a million operations per second.

4th generation: From 1980, engineers learned to place several integrated circuits on a single chip; this development of microelectronics led to the creation of the microprocessor. An IC crystal is smaller and thinner than a contact lens. The speed of modern computers is hundreds of millions of operations per second.

In 1977 the first PC (personal computer) from Apple appeared. Since 1981, IBM (International Business Machines) has been the leader in PC production; this company had been operating on the US market since the 19th century, producing various office devices - abacuses, arithmometers, pens and the like - and had established itself as a reliable firm trusted by most business people in the USA. But that is not the only reason why IBM PCs were so much more popular than the Apple Macintosh. Apple Macintosh PCs were a "black box" for the user, who could not take the PC apart, upgrade it or attach new devices, whereas IBM PCs were open to the user and could therefore be assembled like a children's construction kit, so most users chose the IBM PC. Although when we use the word "computer" we picture a PC, there are problems that even modern PCs cannot solve and that only supercomputers, whose speed is measured in billions of operations per second, can handle.

Lebedev's scientific school successfully competed in its results with IBM, the leading US firm. Among the scientists of the world, Lebedev's contemporaries, there is no one else whose creative work spanned the period from the creation of the first tube computers to the ultra-high-speed supercomputer. When the American scientist Norbert Wiener, who is called the "first prophet of cybernetics," came to the USSR in 1960, he noted: "They are quite a bit behind us in equipment, but far ahead of us in the THEORY of automation." Unfortunately, cybernetics was persecuted in those years as a "bourgeois pseudoscience" and cybernetics researchers were imprisoned, because of which Soviet electronics began to lag noticeably behind foreign electronics. Although it became impossible to create new computers, no one could forbid scientists to think, which is why our Russian scientists are still at the forefront of world scientific thought in the theory of automation.

Various programming languages (algorithmic languages) were created for developing computer programs. FORTRAN (FORmula TRANslator) was the first, created in 1956 by J. Backus. In 1964 BASIC appeared (Beginner's All-purpose Symbolic Instruction Code), by T. Kurtz and J. Kemeny. In 1971 Professor Niklaus Wirth of ETH Zurich created the Pascal language, which he named after the scientist Blaise Pascal. Other languages were created as well: Ada, Algol, Cobol, C, Prolog, Fred, Logo, Lisp and others. But Pascal remains one of the most popular teaching languages; many later languages took their basic commands and principles of program construction from Pascal, for example C, C++ and the Delphi programming system, and even BASIC, as it changed, borrowed its structure and universality from Pascal. In the 11th grade we will study the Pascal language and learn how to create programs for solving problems with formulas and for text processing, and learn how to draw and create moving pictures.

Supercomputers

5. The future of computers

6. Consolidation of new knowledge

Consolidation of new material is possible with the help of a test in a multimedia presentation for the lesson: the “Test yourself” section in each part of the presentation: “Prehistory of computers”, “Generations of computers”, “Gallery of scientists”.

Knowledge on this topic can be tested using the "History of Computing Technology" tests (Appendix 1), in 4 variants, and the test about scientists, "Informatics in Faces" (Appendix 2).

7. Summing up the lesson

Checking the completed tables (Appendix 3)

8. Homework

  • the lecture notes in the notebook and the tables "Prehistory of computers" and "Generations of computers"
  • prepare a report about the 5th generation of computers (the future of computers)
