The Principle and Development History of the Computer

An electronic computer (hereinafter referred to as a computer) is a machine that processes data according to a series of instructions.

There are many kinds of computers, but in general a computer is a tool for processing information. According to Turing machine theory, a computer with the most basic capabilities can do anything that any other computer can do. Therefore, setting aside time and storage constraints, everything from personal digital assistants (PDAs) to supercomputers can complete the same jobs. In other words, a computer of one and the same design can be used for tasks ranging from company payroll management to unmanned spacecraft control, given only the appropriate modifications. Owing to the rapid progress of science and technology, each generation of computers significantly surpasses its predecessors in performance, a phenomenon sometimes referred to in connection with "Moore's Law".

Computers come in many forms. Early computers were the size of a house, whereas today some embedded computers are smaller than a deck of playing cards. Even today, of course, large numbers of huge supercomputers serve the special scientific-computing or transaction-processing needs of large organizations. A relatively small computer designed for personal use is called a microcomputer; this is usually what we mean when we use the word "computer" in daily life. However, the most common form of computer today is the embedded computer. Embedded computers are usually simple and small, and are used to control other devices, whether airplanes, industrial robots, or digital cameras.

The above definition of the electronic computer includes many special-purpose devices that can calculate or that have limited functions. When it comes to modern electronic computers, however, their most important feature is that, given the correct instructions, any one of them can simulate the behavior of any other (limited only by its own storage capacity and execution speed). Accordingly, modern electronic computers are also called general-purpose electronic computers, in contrast with the early machines.

History

ENIAC is a milestone in the history of computer development. The English word "computer" originally referred to people engaged in data calculation, who often used mechanical calculating devices or analog computers. The ancestors of these early devices include the abacus and the Antikythera mechanism, which dates to around 87 BC and was used by the ancient Greeks to calculate the movements of the planets. With the renewed flourishing of mathematics and engineering in Europe at the end of the Middle Ages, Wilhelm Schickard built the first calculating device in Europe in 1623.

In 1801, Joseph Marie Jacquard improved the design of the loom, using a series of punched paper cards as a program to weave complex patterns. Although the Jacquard loom is not considered a true computer, its appearance was indeed an important step in the development of the modern computer.

Charles Babbage was the first person to conceive and design a fully programmable computer, in the 1820s. However, owing to technical limitations, financial constraints, and his own incessant tinkering with the design, the machine was never built in his lifetime. By the late 19th century, many technologies that would prove vital to computer science had appeared one after another, including the punched card and the vacuum tube. Hermann Hollerith designed a tabulating machine that achieved large-scale automatic data processing using punched cards.

In the first half of the 20th century, many single-purpose and increasingly complex analog computers were developed to meet the needs of scientific calculation. These computers were mechanical or electronic models of the specific problems they targeted. In the 1930s and 1940s, computer performance grew stronger, generality improved, and the key features of the modern computer were added one by one.

In 1937, Claude Shannon published his landmark paper "A Symbolic Analysis of Relay and Switching Circuits", which first described the application of digital electronics. He showed how switches could be used to realize logical and mathematical operations, and he subsequently consolidated his ideas by studying Vannevar Bush's differential analyzer. This was an important moment, marking the beginning of binary electronic circuit design and the application of logic gates. The pioneers of these key ideas should also include Almon Strowger, who patented a device containing logic gates; Nikola Tesla, who filed for circuit devices with logic gates as early as 1898; and Lee De Forest, who replaced the relay with the vacuum tube in 1907.

Along such a long journey of exploration, it is quite difficult to define the "first electronic computer". On May 12, 1941, Konrad Zuse completed his electromechanical device Z3, the first machine featuring automatic binary arithmetic and feasible programming, but it was not an "electronic" computer. Other noteworthy achievements include: the Atanasoff-Berry Computer, born in the summer of 1941, a special-purpose machine that nevertheless used vacuum-tube calculation, binary values, and reusable memory; the secret Colossus computer, demonstrated in Britain in 1943, which showed that vacuum tubes were trustworthy and permitted electrical reprogramming, although its programming ability was extremely limited; Harvard University's Harvard Mark I; and the decimal-based ENIAC (1946), the first general-purpose computer, whose structure was so inflexible that every reprogramming meant rewiring its electrical and physical circuits.

The team that developed ENIAC improved the design to address its defects, eventually arriving at the von Neumann architecture (the stored-program architecture). This system is the foundation of all computers today. In the mid-to-late 1940s, a great many computers based on this architecture began to be developed, the earliest in Britain. Although the first to be built and put into operation was the Small-Scale Experimental Machine (SSEM), the first truly practical machine was probably EDSAC.

Throughout the 1950s, vacuum-tube computers were dominant. In the 1960s, transistor computers took their place: transistors are smaller, faster, cheaper, and more reliable, which made computers commercially viable. In the 1970s, the introduction of integrated-circuit technology greatly reduced production costs, and computers began to enter ordinary households.

Principle

The main structure of a personal computer:

monitor

motherboard

CPU (microprocessor)

main storage (memory)

expansion card

power supply

optical drive

secondary storage (hard disk)

keyboard

mouse

Despite their many forms, nearly all computers today are built on the stored-program design, known as the von Neumann architecture. This structure is what realizes a practical general-purpose computer.

In the stored-program structure, a computer is described as four main parts: the arithmetic logic unit (ALU), the control circuit, the memory, and the input/output devices (I/O). These components are connected by groups of wires (in particular, a group of wires used to carry data for multiple purposes is called a bus) and are driven by a clock (although other events may also drive the control circuit).

Conceptually, a computer's memory can be viewed as a set of "cells". Each cell has a number called its address and can store a small, fixed-length piece of information. This information may be either an instruction (telling the computer what to do) or data (the object the instruction operates on). In principle, every cell can store either one.

The arithmetic logic unit (ALU) may be called the brain of the computer. It performs two kinds of operations. The first is arithmetic, such as adding or subtracting two numbers; the arithmetic capability of an ALU is quite limited, and some ALUs do not support multiplication or division at the circuit level at all (in which case those operations must be carried out in software). The second is comparison: given two numbers, the ALU compares them to determine which is larger.
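As a sketch of how multiplication can be done in software on an ALU that offers only addition, shifts, and comparison, here is the classic shift-and-add method (the routine and its name are illustrative, not taken from any particular machine):

```python
def multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers using only addition and
    bit shifts -- the kind of routine used when the ALU has no
    hardware multiplier (shift-and-add method)."""
    result = 0
    while b > 0:
        if b & 1:          # lowest bit of b is set
            result += a    # add the current shifted multiplicand
        a <<= 1            # a = a * 2 (one left shift)
        b >>= 1            # move on to the next bit of b
    return result

print(multiply(6, 7))  # prints 42
```

Each loop iteration examines one binary digit of `b`, so the routine needs only as many additions as `b` has bits, rather than `b` repeated additions.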

The I/O system is the means by which a computer receives information from the outside world and feeds its results back to it. On a standard personal computer, the input devices are mainly the keyboard and mouse, while the output devices are the monitor, the printer, and the many other I/O devices that can be connected to the machine.

The control system links all of the above parts together. Its function is to read instructions and data from memory and from the input/output devices, decode the instructions, deliver the correct inputs to the ALU as each instruction requires, tell the ALU what to do with the data, and return the results to the right place. An important component of the control system is a counter that keeps track of the address of the current instruction. Usually this counter is incremented as instructions execute, but when an instruction indicates a jump, this rule is not followed.

Since the 1980s, the ALU and the control unit (together forming the central processing unit, or CPU) have gradually been integrated onto a single integrated circuit, called a microprocessor. The working pattern of such a computer is quite intuitive: within a clock cycle, the computer first fetches an instruction and data from memory, then executes the instruction, stores the result, and fetches the next instruction. This process repeats until a halt instruction is encountered.

The instruction set, decoded by the controller and executed by the arithmetic unit, is a carefully defined set of simple instructions, very limited in number. It can generally be divided into four categories:

1) data movement (for example, copying a value from storage cell A to storage cell B);

2) arithmetic and logic operations (for example, computing the sum of cells A and B and returning the result to cell C);

3) conditional tests (for example, if the value in cell A is 1, making the address of the next instruction cell F);

4) instruction sequencing (for example, jumping to another instruction address).

For example, 10110000 is a copy instruction code of the Intel x86 microprocessor. The instruction set a computer supports is that computer's machine language. Adopting a popular machine language therefore makes it easier for existing software to run on a new computer, which is why commercial software developers usually concentrate on only one or a few machine languages.
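The four categories above, together with the program counter described earlier, can be sketched as a toy stored-program machine (the instruction names MOV, ADD, JZ, JMP, and HALT are illustrative, not a real instruction set):

```python
# Minimal stored-program machine: a program counter (pc) steps through
# the instructions, incrementing normally and being overwritten on jumps.
# Instruction names are hypothetical, for illustration only.

def run(program, memory):
    pc = 0  # program counter: address of the current instruction
    while True:
        op, *args = program[pc]
        if op == "MOV":          # data movement: memory[dst] = memory[src]
            dst, src = args
            memory[dst] = memory[src]
        elif op == "ADD":        # arithmetic: memory[dst] = memory[a] + memory[b]
            dst, a, b = args
            memory[dst] = memory[a] + memory[b]
        elif op == "JZ":         # conditional test: jump if memory[cell] == 0
            cell, target = args
            if memory[cell] == 0:
                pc = target
                continue
        elif op == "JMP":        # instruction sequencing: unconditional jump
            (target,) = args
            pc = target
            continue
        elif op == "HALT":       # termination instruction
            return memory
        pc += 1                  # default: move to the next instruction

# Example program: sum the integers 5 down to 1 into cell 0,
# using cell 1 as the loop counter and cell 2 as the constant -1.
memory = {0: 0, 1: 5, 2: -1}
program = [
    ("JZ", 1, 4),      # 0: if counter == 0, jump to HALT
    ("ADD", 0, 0, 1),  # 1: total = total + counter
    ("ADD", 1, 1, 2),  # 2: counter = counter - 1
    ("JMP", 0),        # 3: back to the test
    ("HALT",),         # 4: stop
]
print(run(program, memory)[0])  # prints 15
```

Note how the same `memory` dictionary could, in a real machine, hold the program as well as the data; here they are kept separate only for readability.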

More powerful minicomputers, mainframes, and servers may differ from the model above: they usually divide tasks among several CPUs for execution. Today's microprocessors and multi-core personal computers are also developing in this direction.

Supercomputers usually have architectures significantly different from the basic stored-program machine. They often consist of thousands of CPUs, but such designs tend to be useful only for specialized tasks. Among computers of all kinds, some microcontrollers use the Harvard architecture, which keeps programs and data separate.

Digital Circuit Realization of the Computer

The physical realization of the conceptual designs above has taken many forms. As mentioned earlier, a stored-program computer can be mechanical, as in Babbage's design, or based on digital electronics. Digital circuits can realize arithmetic and logical operations on binary numbers using electrically controlled switches such as relays. Shannon's paper showed precisely how to arrange relays into logic gates that realize simple Boolean operations. Other scholars quickly pointed out that vacuum tubes could replace relay circuits. Originally used as amplifiers in radio circuits, vacuum tubes came to be used more and more as fast switches in digital electronics: when one pin of the tube is energized, current can flow freely between the other two.

Many complicated tasks can be designed and accomplished by arranging and combining logic gates; the adder is one example. In the electronic domain, this device adds two numbers and saves the result. In computer science, such a method of achieving a specific purpose through a set of operations is called an algorithm. Eventually, a complete ALU and controller were successfully assembled from a considerable number of logic gates. How considerable? Consider CSIRAC, perhaps the smallest practical vacuum-tube computer: the machine contained 2,000 tubes, many of them dual-purpose devices, meaning some 2,000 to 4,000 logic devices in total.
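A sketch of how an adder emerges from logic gates, with Boolean functions standing in for the physical gates (this is the standard full-adder construction, shown here purely as an illustration):

```python
# Logic gates modeled as Boolean functions on the bits 0 and 1.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three input bits; return (sum_bit, carry_out).
    Built from two half adders (XOR + AND) plus an OR gate."""
    s1 = XOR(a, b)
    sum_bit = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    return sum_bit, carry_out

def ripple_add(x_bits, y_bits):
    """Add two equal-length bit lists (least significant bit first)
    by chaining full adders -- a ripple-carry adder."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 6 (LSB-first: 0,1,1) + 3 (LSB-first: 1,1,0) = 9 (LSB-first: 1,0,0,1)
print(ripple_add([0, 1, 1], [1, 1, 0]))  # prints [1, 0, 0, 1]
```

The carry "ripples" from the lowest bit to the highest, exactly as it does in hand-written column addition; hardware adders in real CPUs use faster carry schemes, but the logical content is the same.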

Vacuum tubes were clearly unsuited to building large-scale gate circuits: expensive, unstable (especially in large numbers), bulky, power-hungry, and not fast enough, although far faster than mechanical switching circuits. All of this led to their replacement by transistors in the 1960s. Transistors are smaller, easier to work with, more reliable, more energy-efficient, and cheaper.

Integrated circuits are the foundation of today's electronic computers. After the 1960s, transistors began to be replaced by integrated circuits, which place large numbers of transistors, various other electrical components, and connecting wires on a single silicon board. In the 1970s, the ALU and the controller, the two parts of the CPU, began to be integrated onto one chip, called the "microprocessor". The history of the integrated circuit shows the number of devices on a chip growing rapidly: the first integrated circuits contained only a few dozen components, while by 2006 the number of transistors on an Intel Core Duo processor was as high as 151 million.

Whether vacuum tubes, transistors, or integrated circuits, all of them can serve as the "storage" components of a stored-program architecture through the flip-flop design. Flip-flops are indeed used for small-scale, ultra-high-speed storage. However, almost no computer design uses flip-flops for large-scale data storage. The earliest computers instead used Williams tubes, which wrote signals onto the face of a CRT screen, or mercury delay lines, through which the signals traveled as sound waves.
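The idea behind a flip-flop can be sketched with an SR latch built from two cross-coupled NOR gates. In a real circuit the outputs settle by analog feedback; the Python model below imitates that by iterating a few times until the outputs stabilize (a simplification, not a circuit simulation):

```python
def NOR(a, b):
    """NOR gate on the bits 0 and 1."""
    return 1 - (a | b)

def sr_latch(set_in, reset_in, q=0):
    """One update of an SR latch made of two cross-coupled NOR gates.
    q is the currently stored bit; the new stored bit is returned.
    (Illustrative model: real hardware settles via feedback.)"""
    for _ in range(4):            # iterate until the feedback settles
        q_bar = NOR(set_in, q)    # complementary output
        q = NOR(reset_in, q_bar)  # main output feeds back into q_bar
    return q

state = 0
state = sr_latch(1, 0, state)  # S=1: store a 1
print(state)                   # prints 1
state = sr_latch(0, 0, state)  # inputs idle: the bit is remembered
print(state)                   # prints 1
state = sr_latch(0, 1, state)  # R=1: clear back to 0
print(state)                   # prints 0
```

The middle call is the point of the exercise: with both inputs low, the feedback loop holds its previous value, which is exactly what makes a flip-flop usable as one bit of storage.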