What is a computer?

An electronic computer, commonly known simply as a computer, is a machine that processes data according to a series of instructions. The technical study of computation is called computer science, and data-centered study is called information technology.

There are many kinds of computers; in general, a computer is a tool for processing information. According to the theory of the Turing machine, a computer with the most basic functions can do anything any other computer can do. Therefore, setting aside time and storage constraints, everything from personal digital assistants (PDAs) to supercomputers can accomplish the same jobs. In other words, a computer of one and the same design can be used for tasks as varied as company payroll management and unmanned-spacecraft control, so long as it is programmed accordingly. Owing to the rapid progress of science and technology, each generation of computers significantly surpasses its predecessors in performance, a trend often summarized as "Moore's Law" (strictly, an observation about the steady doubling of transistor counts).

Computers take many physical forms. Early computers were as big as a house, while some embedded computers today are smaller than a deck of playing cards. Even now, large numbers of supercomputers serve big organizations with special scientific-computing or transaction-processing needs. Relatively small computers designed for personal use are called personal computers, or microcomputers; these are what the word "computer" usually refers to in everyday speech. Today, however, the most common form of computer is the embedded computer. Embedded computers are usually simple and small, and are used to control other devices, whether airplanes, industrial robots, or digital cameras.

The definition above covers many special-purpose devices that can calculate or that have limited functions. The most important feature of a modern electronic computer, however, is that it can simulate the behavior of any other computer given the correct instructions (limited only by its own storage capacity and execution speed). For this reason, modern electronic computers are also called general-purpose computers, in contrast to their early special-purpose predecessors.
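This universality can be made concrete with a small sketch: a general machine is just an interpreter that executes whatever program it is handed. The tiny instruction set below is invented purely for illustration and does not correspond to any historical machine.

```python
# A minimal sketch of the "stored program" idea: one general machine
# (this interpreter) runs any program expressed in a tiny invented
# instruction set, so changing the task means changing the program,
# not the machine.

def run(program, registers=None):
    """Execute a list of instructions; registers maps names to integers."""
    regs = dict(registers or {})
    pc = 0  # program counter
    while True:
        op, *args = program[pc]
        if op == "set":                       # set r, value
            regs[args[0]] = args[1]
        elif op == "add":                     # add r, a, b  ->  r = a + b
            regs[args[0]] = regs[args[1]] + regs[args[2]]
        elif op == "dec":                     # dec r  ->  r = r - 1
            regs[args[0]] -= 1
        elif op == "jnz":                     # jump to target if register != 0
            if regs[args[0]] != 0:
                pc = args[1]
                continue
        elif op == "halt":
            return regs
        pc += 1

# One program among infinitely many this machine can run:
# sum the integers 1..n.
sum_1_to_n = [
    ("set", "acc", 0),            # 0: accumulator = 0
    ("add", "acc", "acc", "n"),   # 1: acc += n
    ("dec", "n"),                 # 2: n -= 1
    ("jnz", "n", 1),              # 3: loop while n != 0
    ("halt",),                    # 4
]

print(run(sum_1_to_n, {"n": 5})["acc"])  # 15
```

The same interpreter, fed a different instruction list, would compute payroll or a trajectory: the hardware is fixed, only the program varies.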

History

ENIAC is a milestone in the history of computer development. The English word "computer" originally referred to a person engaged in calculation, who would often use mechanical calculating devices or analog computers. The ancestors of these early devices include the abacus and the Antikythera mechanism, which dates back to about 87 BC and was used by the ancient Greeks to calculate planetary motion. With the revival of mathematics and engineering in Europe at the end of the Middle Ages, Wilhelm Schickard built the first calculating device in Europe in 1623: a "calculating clock" that could add and subtract numbers of up to six digits, announce answers by ringing a bell, and operate by means of rotating gears.

In 1642, the French mathematician Blaise Pascal, building on earlier work, constructed a mechanical calculator capable of eight-digit calculation. Many units were sold, and it became a fashionable item of its day.

In 1801, Joseph-Marie Jacquard improved the design of the loom, using a series of punched paper cards as a program to weave complex patterns. Although the Jacquard loom is not considered a real computer, its appearance was an important step in the development of modern computers.

Charles Babbage was the first person to conceive and design a fully programmable computer, beginning around 1820. However, owing to the technical conditions of the time, financial constraints, and his constant reworking of the design, this computer never appeared in his lifetime. By the end of the 19th century, many technologies that would prove significant for computer science had appeared one after another, including the punched card and the vacuum tube. Herman Hollerith designed a tabulating machine that achieved large-scale automatic data processing using punched cards.

In the first half of the 20th century, many single-purpose and increasingly complex analog computers were developed to meet the needs of scientific calculation. These were mechanical or electrical models of the specific problems they targeted. In the 1930s and 1940s, computers grew more powerful and more general, and the key features of modern computers were added one by one.

In 1937, Claude Elwood Shannon published his landmark thesis, A Symbolic Analysis of Relay and Switching Circuits, the first work to set out this application of digital electronics: he showed how switches could realize logical and mathematical operations. He then consolidated his ideas by studying Vannevar Bush's differential analyzer. This was an important moment, marking the beginning of binary electronic circuit design and the application of logic gates. Pioneers of these key ideas include Almon Strowger, who patented a device containing logic gates; Nikola Tesla, who as early as 1898 applied for patents on circuit devices with logic gates; and Lee De Forest, who in 1907 replaced the relay with the vacuum tube.
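Shannon's observation, that switching circuits can realize logical and mathematical operations, can be sketched directly in code. Modelling a switch as a Boolean value, everything below is built from a single NAND primitive and then composed into a one-bit full adder; the function names are ours, for illustration only.

```python
# Shannon's insight, sketched: treat a switch as a Boolean value,
# derive the basic gates from one primitive (NAND), then compose
# gates into arithmetic -- here, a one-bit full adder.

def nand(a, b):
    return not (a and b)

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out)."""
    s = xor(xor(a, b), carry_in)
    carry_out = or_(and_(a, b), and_(carry_in, xor(a, b)))
    return s, carry_out

# 1 + 1 with no carry in: sum bit 0, carry 1 (binary 10)
print(full_adder(True, True, False))  # (False, True)
```

Chaining such adders bit by bit gives multi-digit binary addition, which is exactly the step from logic gates to the arithmetic unit of a computer.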

The Amiga 500 computer was produced by Commodore in the 1980s. Looking back over such a long journey, it is quite difficult to define the "first electronic computer". In May 1941, Konrad Zuse completed his electromechanical machine "Z3", the first computer with automatic binary arithmetic and feasible programmability, but it was not an "electronic" computer. Other notable achievements include: the Atanasoff-Berry Computer, born in the summer of 1941, the world's first electronic computer, which used a vacuum-tube calculator, binary values, and reusable memory; the secret Colossus computer demonstrated in Britain in 1943, which showed that vacuum tubes were reliable and electronic reprogramming practical, although its programming ability was extremely limited; the Harvard Mark I of Harvard University; and the decimal-based ENIAC (1944), the first general-purpose computer, whose structure was nonetheless so inflexible that every reprogramming meant rewiring its electrical and physical circuits.

The team that developed ENIAC further improved the design to address its defects, finally producing the von Neumann architecture (stored-program architecture) we are familiar with today, the foundation of all computers now in use. In the middle and late 1940s, a large number of computers based on this architecture began to be developed, the earliest in Britain. Although the first machine developed and put into operation was the Small-Scale Experimental Machine (SSEM), the first truly practical one was probably EDSAC.

Throughout the 1950s, vacuum-tube computers dominated. On September 12, 1958, the integrated circuit was invented by Jack Kilby of Texas Instruments and, independently soon after, by Robert Noyce (later a founder of Intel). Not long afterwards, the microprocessor appeared. Computers designed between 1959 and 1964 are generally called second-generation computers.

In the 1960s, transistor computers took their place. Transistors were smaller, faster, cheaper, and more reliable, which made commercialization possible. Computers from 1964 to 1972 are generally called third-generation computers. They used large numbers of integrated circuits; the typical model is the IBM System/360 series.

In the 1970s, integrated-circuit technology greatly reduced the production cost of computers, and computers began to reach ordinary households. Computers after 1972 are customarily called fourth-generation computers, based on large-scale and later very-large-scale integration (VLSI). On April 1, 1972, Intel introduced the 8008 microprocessor. In 1976, Stephen Wozniak and Steve Jobs founded Apple Computer and launched the Apple I. In 1977, Apple's second-generation computer, the Apple II, was released. On June 1, 1979, Intel released the 8088 microprocessor (a 16-bit processor with an 8-bit external bus).

In 1982, microcomputers became popular and entered schools and homes in large numbers. In January 1982, the Commodore 64 was released at a price of 595 US dollars. In February 1982, the 80286 was released; its clock frequency eventually reached 20 MHz, it added protected mode, could address 16 MB of memory, and supported over 1 GB of virtual memory. It executed 2.7 million instructions per second and integrated 134,000 transistors.

In November 1990, the first MPC (Multimedia Personal Computer) standard was released. The processor requirement was at least an 80286/12 MHz (later raised to an 80386SX/16 MHz), and the CD-ROM transfer rate had to be at least 150 KB/s. On October 10, 1994, Intel released the 75 MHz Pentium processor. On November 1, 1995, the Pentium Pro was released; its clock frequency could reach 200 MHz, it completed 440 million instructions per second, and it integrated 5.5 million transistors. On January 8, 1997, Intel released the Pentium MMX, with enhanced gaming and multimedia capabilities.

Since then, computers have advanced day by day. Moore's Law, published in 1965, has been confirmed again and again, and the prediction is expected to remain applicable for another 10 to 15 years.
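As a rough illustration of what Moore's observation implies numerically (assuming, for this sketch, a doubling of transistor counts every two years):

```python
# Moore's Law as arithmetic: doubling every 2 years compounds fast --
# ten doublings (20 years) is a factor of 2**10 = 1024, roughly 1000x.

def transistors(initial, years, doubling_years=2.0):
    """Projected transistor count after `years`, doubling every `doubling_years`."""
    return initial * 2 ** (years / doubling_years)

# Comparing against a pair of figures from this article: the 80286
# (1982) integrated 134,000 transistors and the Pentium Pro (1995)
# 5.5 million -- about a 41x increase in 13 years, within roughly a
# factor of two of the ~90x a strict 2-year doubling would predict.
print(transistors(134_000, 13) / 1e6)  # roughly 12 million
```

The gap between projection and reality shows why the "law" is an empirical trend rather than a physical rule: the doubling period itself has drifted over the decades.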

Computer development:

Before the 19th century

First, pioneers of the mechanical computer era

In western Europe, the great social changes from the Middle Ages to the Renaissance strongly promoted the development of natural science and technology, and creativity long suppressed by theocracy was released as never before. Among the sparks it threw off, the idea of building a machine to help people calculate was one of the most dazzling. From then on, one scientist after another strove to turn this spark of thought into a torch that could guide mankind toward freedom. Limited by the overall level of science and technology of their time, however, most of them failed. Such is the fate of pioneers: they rarely see the rich fruits of their work, and later generations, enjoying that sweetness, should be able to taste in it some of their sweat and tears.

1614: The Scot John Napier (1550-1617) published a paper in which he described an ingenious device that could perform the four arithmetic operations and extract square roots.

1623: Wilhelm Schickard (1592-1635) built a "calculating clock" that could add and subtract numbers of up to six digits and announce answers by ringing a bell. It operated by means of rotating gears.

William Oughtred (1575-1660) invented the slide rule.

1642: The French mathematician Blaise Pascal, building on earlier work, constructed a mechanical calculator capable of eight-digit calculation. It also sold in quantity and became a fashionable commodity.

1668: Samuel Morland (1625-1695) built a non-decimal adding machine suitable for counting coins.

1671: The German mathematician Gottfried Leibniz designed a machine that could multiply, with answers of up to 16 digits.

1775: Charles of England successfully built a machine similar to Leibniz's calculator, but more advanced.

1776: The German Mathieus Hahn successfully built a multiplying machine.

1801: Joseph-Marie Jacquard developed an automatic loom controlled by punched cards.

1820: The Frenchman Charles Xavier Thomas de Colmar (1785-1870) produced the first commercially successful calculating machine. It was very reliable, could sit on a desktop, and remained on the market for more than 90 years.

1822: The Englishman Charles Babbage (1792-1871) designed the Difference Engine and the Analytical Engine. The theory behind the designs was very advanced, resembling the electronic computers of a hundred years later; in particular, the use of punched cards to input programs and data was adopted by later generations.

1832: Babbage and Joseph Clement produced a working portion of the Difference Engine, which could operate on 6-digit numbers. The full design was to handle numbers of 20 or 30 digits and would have been nearly as big as a house, with results output as punched holes. Given the manufacturing technology of the time, however, the complete design proved too difficult to build.

1834: George Scheutz of Stockholm built a difference engine out of wood.

1834: Babbage conceived a general-purpose Analytical Engine that would store programs and data on read-only memory (punched cards). He continued this work in later years; by 1840 the operands had been extended to 40 digits. The design essentially embodied the ideas of a central processing unit (CPU) and the stored program, and its programs could jump conditionally.

1842: Babbage's Difference Engine project was cancelled by the government because of its high development cost, but he continued to devote much time and energy to the Analytical Engine.

1843: Scheutz and his son Edvard Scheutz built a difference engine, and the Swedish government agreed to continue supporting their research.

1847: Babbage spent two years designing a relatively simple 31-digit difference engine, but no one was interested in supporting its construction. Much later, however, the London Science Museum built the machine with modern techniques and found that it worked perfectly accurately.

1848: The British mathematician George Boole founded Boolean algebra, paving the way for binary computers almost a century in advance.

1853: To Babbage's delight, the Scheutzes successfully completed a full-scale difference engine capable of 15-digit operations, realizing Babbage's vision. Later, Bryan Donkin of London built a second, more reliable one.

1858: The first Scheutz machine was bought by the Dudley Observatory in Albany, and the second by the British government. The observatory did not make full use of its machine, which was later sent to a museum; the second one, however, was fortunate enough to serve for a long time.

1871: Babbage built some parts of the Analytical Engine and its printer.

1878: Ramon Verea, a Spaniard living in New York, built a successful desktop calculator, faster than any mentioned above. But he was not interested in putting it on the market; he only wanted to prove that a Spaniard could do better than the Americans.

1879: A committee of inquiry began to study the feasibility of the Analytical Engine and finally concluded that it could not work at all. Babbage was by then dead, and after the inquiry his Analytical Engine was completely forgotten, except by Howard Aiken.

1885: More calculators appeared in this period, in the United States, Russia, Sweden, and elsewhere. They began to use grooved cylinders instead of easily damaged gears.

1886: Dorr E. Felt (1862-1930) of Chicago built the first key-operated calculator. It was very fast: the result appeared as soon as the keys were released.

1889: Felt launched a desktop printing calculator.

1890: The 1890 US census. The census of 1880 had taken seven years to tabulate, which meant the 1890 count would take more than ten. The US Census Bureau hoped for a machine that would improve efficiency. Herman Hollerith, building on Babbage's ideas, used punched cards to store data and designed a machine around them; the count took only six weeks and yielded accurate figures (62,622,250 people). Hollerith founded the Tabulating Machine Company, which later developed into IBM, and made a fortune.

1892: William S. Burroughs (1857-1898) of St. Louis built a machine with more functions than Felt's, truly launching the office automation industry.

1896: Herman Hollerith founded the predecessor of IBM.

1900~1910

1906: Henry Babbage, Charles Babbage's son, with the support of R. W. Munro, completed the Analytical Engine his father had designed, but only to prove that it could work; it was never launched as a product.

Second, the early days of the electronic computer

Until this point, computers were all based on mechanical operation. Although some products introduced electrical elements, these were always subordinate to the mechanics and never entered the computer's truly flexible domain: logical operations. Afterwards, with the rapid development of electronics, computers began the transition from the mechanical era to the electronic era. Electronics increasingly became the substance of the computer and mechanics the servant; as their positions changed, the computer began to change qualitatively. The following are the main events of this transitional period:

Lee De Forest of the United States invented the vacuum triode. Before it, building a digital electronic computer was impossible; its invention laid the foundation for the development of electronic computers.

1920~1930

February 1924: The epoch-making company IBM was founded.

1930~1940

1935: IBM introduced the IBM 601, a punched-card machine that could compute a multiplication in one second. The machine was important in both science and business; about 1500 were built.

1937: Alan M. Turing (1912-1954) of Cambridge University published a paper putting forward the mathematical model later known as the "Turing machine".

1937: George Stibitz of Bell Laboratories demonstrated a device that used relays to represent binary numbers. Although it was only a demonstration model, it was the first binary calculating device of its kind.

1938: Claude Shannon published his paper on the logical representation of relay circuits.

1938: Konrad Zuse of Berlin and his assistants completed a mechanically programmable binary computer, later named Z1, whose theoretical basis was Boolean algebra. It used something like movie film as its storage medium, could operate on numbers with seven-digit exponents and 16-digit fractions, accepted input from a keyboard, and displayed results with light bulbs.

January 1, 1939: William Hewlett and David Packard founded HP in their garage in California. The company name, formed from their two surnames, was settled by a coin flip.

November 1939: The American John V. Atanasoff and his student Clifford Berry completed a 16-bit adder, the first calculating device built with vacuum tubes.

1939: At the outbreak of World War II, military demand greatly accelerated the development of computer technology.

1939: Zuse and Schreyer began developing the Z2 computer on the basis of the Z1, improving its storage and calculation units with relays. The project was interrupted for a year by Zuse's military service.

1939/1940: Schreyer completed a 10-bit adder using vacuum tubes, with neon lamps as the storage device.

1940~1950

January 1940: Samuel Williams and Stibitz of Bell Laboratories completed a computer that could perform complex-number arithmetic. It used a large number of relays, borrowed some telephone technology, and adopted advanced coding techniques.

Summer 1941: Atanasoff and his student Berry completed a computer that could solve linear algebraic equations, named the "ABC" (Atanasoff-Berry Computer). It used capacitors as memory and punched cards as auxiliary memory; the holes were actually burned into the cards. Its clock frequency was 60 Hz, and an addition took one second.

1941: Zuse completed the Z3 computer in Germany, the first programmable computer in operation. It could handle numbers with seven-digit exponents and 14-digit fractions, used a large number of relays, could perform three to four additions per second, and took 3 to 5 seconds for a multiplication.

1943: Computers built from 1943 to 1959 are usually called first-generation computers. They used vacuum tubes, all programs were written in machine code, and punched cards were used. The typical machine was the UNIVAC.

January 1943: The Mark I automatic sequence-controlled computer was successfully developed in the United States. The whole machine was 51 feet long, weighed 5 tons, and had 750,000 parts, using 3304 relays and 60 switches as mechanical read-only memory. Programs were stored on paper tape, and data could come from paper tape or card readers. It was used to calculate ballistic firing tables for the US Navy.

April 1943: Max Newman, Wynn-Williams, and their research team completed "Heath Robinson", a code-breaking machine rather than a computer in the strict sense. But it used logic elements and vacuum tubes, and its optical device could read 2000 characters per second, so it too was of epoch-making significance.

September 1943: Williams and Stibitz completed the "Relay Interpolator", later named the "Model II Relay Calculator". This was a programmable computer that likewise used paper tape to input programs and data. It ran reliably, represented each number with 7 relays, and could perform floating-point operations.

December 1943: Britain brought out the earliest programmable computer, containing 2400 vacuum tubes and built to decipher German ciphers. It could process about 5000 characters per second, but was destroyed soon after use, reportedly following a mistake made while translating Russian.

1946: ENIAC (Electronic Numerical Integrator And Computer), the first true digital electronic computer, was begun in 1943 and completed in 1946. Its leaders were John W. Mauchly and J. Presper Eckert. It weighed 30 tons, used 18,000 vacuum tubes, and consumed about 150 kilowatts of power. It was used mainly to calculate trajectories and in the development of the hydrogen bomb.

Third, the development of transistor computers.

Although vacuum-tube machines already belonged to the category of modern computers, their large size, high power consumption, frequent failures, and high prices greatly restricted their spread and application. Not until the invention of the transistor did the electronic computer find its point of takeoff.

1947: William B. Shockley, John Bardeen, and Walter H. Brattain of Bell Laboratories invented the transistor, opening a new chapter of the electronic age.

1949: EDSAC. Wilkes of Cambridge University and his team built a stored-program computer. Its input and output devices were still paper tape.

1949: EDVAC (Electronic Discrete Variable Automatic Computer), the first computer to use magnetic tape. This was a breakthrough: programs could be stored on tape and reloaded many times. The machine was proposed by John von Neumann.

1949: "Computers in the future may weigh no more than 1.5 tons." This was a bold prediction made by a science magazine of the time.

1950~1960

1950: The floppy disk was invented by Yoshiro Nakamatsu of Tokyo Imperial University; its sales rights were acquired by IBM, opening a new era of storage.

1950: Alan Turing, the British mathematician and computer pioneer, argued that a computer could possess human intelligence: if a person converses with a machine through questions and answers and cannot tell whether the respondent is a machine or a human, then the machine has human intelligence.

1951: Grace Murray Hopper completed the first compiler.

1951: Whirlwind: the US Air Force's first computer-controlled real-time air-defense system was developed.

1951: UNIVAC-1: the first commercial computer system, designed by J. Presper Eckert and John Mauchly. Its use by the US Census Bureau for the census marked the computer's entry into the new era of commercial application.

1952: EDVAC (Electronic Discrete Variable Automatic Computer), designed under John von Neumann, was completed.

1953: By this time, there were about 100 computers running in the world.

1953: Magnetic-core memory was developed.

1954: John Backus of IBM and his research team began to develop FORTRAN (FORmula TRANslation), completing it in 1957. It is a high-level computer language suited to scientific research.

1956: The first conference on artificial intelligence was held at Dartmouth College.

1957: IBM successfully developed the first dot-matrix printer.

1957: The FORTRAN high-level language was successfully developed.

Fourth, integrated circuits: the modern computer takes flight

Although the use of transistors greatly reduced computers' size, price, and failure rate, machines still fell far short of what users wanted, and every industry demanded them. Producing more powerful, lighter, and cheaper machines became a top priority, and the invention of the integrated circuit arrived like timely rain. Its high level of integration not only shrank the volume but also increased speed and reduced failures. People began to build revolutionary microprocessors, and computer technology, after years of accumulation, finally sped onto a highway paved with silicon.

September 12, 1958: The integrated circuit was invented by Jack Kilby of Texas Instruments and, independently soon after, by Robert Noyce (later a founder of Intel); the microprocessor followed. Japan, holding that the invention had drawn on the technology of Japanese companies without due benefit to them, long refused to recognize the patent; only after 30 years was it acknowledged there, allowing Japanese companies to draw some profit from it. The patent expired in 2001.

1959: Computers designed between 1959 and 1964 are generally called second-generation computers. They used large numbers of transistors and printed circuits; computers became smaller and more capable, could run FORTRAN and COBOL, and accepted commands in English characters. Application software appeared in large quantities.
