There are many kinds of computers; in essence, computers are tools for processing information. According to the theory of Turing machines, a computer with only the most basic set of functions can do anything that any other computer can do. Therefore, setting aside constraints of time and storage, everything from a personal digital assistant (PDA) to a supercomputer can accomplish the same jobs. In other words, computers of identical design can be applied to tasks as varied as company payroll management and unmanned-spacecraft control, given the appropriate programs. Thanks to rapid technological progress, each new generation of computers significantly outperforms its predecessors, a trend sometimes summarized as "Moore's Law".
Computers take many physical forms. Early computers were the size of a house, while some of today's embedded computers are smaller than a deck of playing cards. Even now, a large number of supercomputers serve large organizations with special scientific-computing or transaction-processing needs. Relatively small computers designed for personal use are called personal computers, or microcomputers; this is usually what we mean when we use the word "computer" in daily life. Today, however, the most common form of computer is the embedded computer. Embedded computers are usually simple and small, and are used to control other devices, whether airplanes, industrial robots, or digital cameras.
The definition above covers many special-purpose devices that can calculate or that have limited functions. When it comes to modern electronic computers, however, their most important characteristic is that, given the correct instructions, any one of them can simulate the behavior of any other (limited only by its own storage capacity and execution speed). For this reason, in contrast with their early predecessors, modern electronic computers are also called general-purpose computers.
History
ENIAC was a milestone in the history of computer development. The English word "computer" originally referred to a person engaged in numerical calculation, often with the aid of mechanical calculating devices or analog computers. The ancestors of these early calculating devices include the abacus and the Antikythera mechanism, the latter dating back to about 87 BC, which the ancient Greeks used to calculate the motions of the planets. With the flourishing of mathematics and engineering in Europe at the end of the Middle Ages, Wilhelm Schickard built the first calculating machine in Europe in 1623.
In 1801, Joseph Marie Jacquard improved the design of the loom, using a series of punched paper cards as a program for weaving complex patterns. Although the Jacquard loom is not considered a true computer, its appearance was an important step in the development of the modern computer.
Charles Babbage was the first to conceive and design a fully programmable computer, in 1820. However, owing to the technical limits of his time, financial constraints, and his incessant reworking of the design, this computer was never built in his lifetime. By the end of the 19th century, many technologies that would prove significant to computing had appeared one after another, including the punched card and the vacuum tube. Herman Hollerith designed a tabulating machine that used punched cards to carry out large-scale automatic data processing.
In the first half of the 20th century, many single-purpose and increasingly complex analog computers were developed to meet the needs of scientific computation. These were mechanical or electrical models of the specific problems they targeted. In the 1930s and 1940s computers grew more powerful and more general, and the key features of the modern computer were added one by one.
In 1937, Claude Shannon published his landmark paper "A Symbolic Analysis of Relay and Switching Circuits", which described the application of digital electronics for the first time, showing how switches can implement logical and mathematical operations. He then consolidated these ideas through his work on Vannevar Bush's differential analyzer. This was an important moment, marking the beginning of binary electronic circuit design and the application of logic gates. Pioneers of these key ideas include Almon Strowger, who patented a device containing logic gates; Nikola Tesla, who as early as 1898 applied for a patent on circuit devices containing logic gates; and Lee De Forest, who in 1907 replaced the relay with the vacuum tube.
Across such a long span of development it is quite difficult to single out the "first electronic computer". In May 1941, Konrad Zuse completed his electromechanical machine, the Z3, the first computer featuring automatic binary arithmetic and practical programmability, though it was not "electronic". Other noteworthy achievements include the Atanasoff-Berry Computer, completed in the summer of 1941, a special-purpose machine that nonetheless used vacuum-tube computation, binary values, and regenerative memory; the secret British Colossus of 1943, which demonstrated that vacuum tubes were reliable and that electronic reprogramming was feasible, although its programming ability was extremely limited; the Harvard Mark I, built at Harvard University; and the decimal-based ENIAC (1944), the first general-purpose computer, whose design was so inflexible that every reprogramming meant rewiring its electrical and physical circuits.
The team that developed ENIAC, recognizing its defects, went on to improve the design, eventually producing the von Neumann architecture (the stored-program architecture). This architecture is the foundation of all computers today. In the mid-to-late 1940s a wave of computers based on it began development, the earliest in Britain. Although the first machine built and put into operation was the Small-Scale Experimental Machine (SSEM), the first truly practical machine was probably EDSAC.
Vacuum-tube computers dominated throughout the 1950s and were replaced in the 1960s by transistor computers. Transistors were smaller, faster, cheaper, and more reliable, which allowed computers to be commercialized. In the 1970s, integrated-circuit technology greatly reduced production costs, and computers began to reach ordinary households.
Principle
The main components of a personal computer:
Monitor
Motherboard
Central processing unit (microprocessor)
Main memory (RAM)
Expansion cards
Power supply
Optical disc drive
Secondary storage (hard disk)
Keyboard
Mouse
Although computer technology has developed rapidly since the first electronic general-purpose computers of the 1940s, today's computers still basically follow the stored-program design, that is, the von Neumann architecture. It is this architecture that made the practical general-purpose computer a reality.
The stored-program architecture describes a computer as four main parts: the arithmetic logic unit (ALU), the control circuitry, the memory, and the input/output devices (I/O). These components are connected by bundles of wires (when one bundle carries data for several different purposes, it is also called a bus) and are driven by a clock (although other events may drive the control circuitry as well).
Conceptually, a computer's memory can be viewed as a collection of "cells". Each cell has a number called its address and can store a small, fixed-length piece of information. That information may be an instruction (telling the computer what to do) or data (what the instructions operate on). In principle, any cell can store either.
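The cell model above can be sketched in a few lines of Python (a hypothetical illustration only; real memories store fixed-width binary words, not Python integers):

```python
# Memory as a row of numbered cells; the list index is the cell's address.
memory = [0] * 16       # sixteen cells, addresses 0 through 15

memory[4] = 42          # store a piece of data at address 4
memory[7] = 0b10110000  # the very same kind of cell can hold an encoded
                        # instruction; hardware does not distinguish
                        # instructions from data

print(memory[4])        # → 42
```

The point of the sketch is that nothing marks cell 7 as "an instruction": only the way the machine later uses a cell gives its contents meaning.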
The arithmetic logic unit (ALU) may be called the brain of the computer. It performs two kinds of operations. The first is arithmetic, such as adding or subtracting two numbers; the set of arithmetic operations in an ALU may be very limited, and some ALUs do not support multiplication or division at the circuit level at all (in which case programs must carry out multiplication and division in software). The second is comparison: given two numbers, the ALU determines which is larger.
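Both kinds of operation, and the software workaround for a missing multiply circuit, can be sketched as follows (a minimal hypothetical ALU; the function names and the three-way comparison convention are assumptions for illustration):

```python
def alu_add(a, b):
    """Arithmetic: the kind of operation an ALU wires in directly."""
    return a + b

def alu_compare(a, b):
    """Comparison: returns 1, 0, or -1 as a is larger, equal, or smaller."""
    return (a > b) - (a < b)

def multiply(a, b):
    """Multiplication built from repeated addition, as a program must do
    when the ALU lacks a multiply circuit (assumes b is a non-negative
    integer)."""
    result = 0
    for _ in range(b):
        result = alu_add(result, a)
    return result

print(multiply(6, 7))  # → 42
```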
The input/output system is the computer's means of receiving information from the outside world and feeding back the results of its operations. For a standard personal computer, the input devices are mainly the keyboard and mouse, while the output devices are the monitor, the printer, and the many other I/O devices that can be connected to the computer.
The control system ties the parts of the computer together. Its job is to read instructions and data from memory and from the input/output devices, decode the instructions, supply the ALU with the correct inputs that each instruction calls for, and tell the ALU how to process the data and where to send the results. An important component of the control system is a counter that records the address of the current instruction. Normally this counter advances as instructions execute, but when an instruction indicates a jump, that rule is not followed.
Since the 1980s, the ALU and the control unit (together forming the central processing unit, or CPU) have typically been integrated onto a single integrated circuit, called a microprocessor. The working pattern of such a computer is straightforward: in each clock cycle, the computer fetches an instruction and its data from memory, executes the instruction, stores the result, and then fetches the next instruction. The process repeats until a halt instruction is reached.
The instructions that the arithmetic unit executes, as interpreted by the controller, form a small set of simple operations with a very limited repertoire. They generally fall into four categories: 1) data movement (for example, copying a value from storage cell A to storage cell B); 2) arithmetic and logic (for example, computing the sum of cells A and B and placing the result in cell C); 3) condition testing (for example, if the value in cell A is 100, the next instruction is taken from a different address rather than the one that follows); and 4) sequence alteration (for example, unconditionally jumping to the instruction at another address).
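The four categories, the control system's counter, and the fetch-execute cycle can be combined into a toy machine. The mnemonics and encoding below are invented for illustration and do not correspond to any real instruction set:

```python
def run(program, cells):
    """Execute a list of instruction tuples against a dict of storage
    cells. The variable pc plays the role of the control system's
    instruction counter."""
    pc = 0
    while pc < len(program):
        instr = program[pc]
        op = instr[0]
        if op == "MOVE":         # 1) data movement: copy cell a into cell b
            _, a, b = instr
            cells[b] = cells[a]
            pc += 1
        elif op == "ADD":        # 2) arithmetic: cell c = cell a + cell b
            _, a, b, c = instr
            cells[c] = cells[a] + cells[b]
            pc += 1
        elif op == "JEQ":        # 3) condition test: jump if cell a equals v
            _, a, v, target = instr
            pc = target if cells[a] == v else pc + 1
        elif op == "JMP":        # 4) sequence alteration: unconditional jump
            _, target = instr
            pc = target
        elif op == "HALT":       # stop the fetch-execute cycle
            break
    return cells

# Sum cells A and B into C, then copy C into D.
result = run(
    [("ADD", "A", "B", "C"), ("MOVE", "C", "D"), ("HALT",)],
    {"A": 2, "B": 3, "C": 0, "D": 0},
)
print(result["D"])  # → 5
```

Note how the counter pc only advances implicitly for the first two categories; the jump instructions overwrite it directly, which is exactly the exception to the counter's normal behavior described above.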
Instructions, like data, are represented in binary inside the computer. For example, 10110000 is the opcode of one of the copy instructions of Intel x86 microprocessors. The instruction set that a computer supports is that computer's machine language. Consequently, adopting a popular machine language makes existing software easier to run on a new computer, so developers of commercial software usually concentrate on only one or a few widespread machine languages.
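The binary pattern quoted in the text can be checked directly. On x86 the byte 10110000 (hexadecimal B0) begins a "copy an immediate value into the AL register" instruction; the trailing data byte below is an arbitrary example value:

```python
opcode = 0b10110000        # the pattern quoted in the text
assert opcode == 0xB0      # the same byte written in hexadecimal (176 decimal)

# A complete two-byte instruction: the opcode followed by the value to copy.
instruction = bytes([opcode, 42])
print(instruction.hex())   # → b02a
```

Written out this way, it is easy to see why an instruction is indistinguishable from data until the processor decodes it.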
More powerful minicomputers, mainframes, and servers may differ from the model above; they commonly divide their work among multiple CPUs. Today's multi-core microprocessors and personal computers are moving in this direction as well.
Supercomputers often have architectures that differ markedly from the basic stored-program machine. They may contain thousands of CPUs, but such designs tend to be useful only for specific tasks. Among other computers, some microcontrollers use the Harvard architecture, which keeps programs and data in separate memories.