Category: Computer/Network
Problem description:
As the title says.
Analysis:
Computer development history
Before the 20th century
1. Pioneers in the age of mechanical computers
In Western Europe, the great social changes from the Middle Ages to the Renaissance greatly promoted the development of natural science, and with the advance of science and technology, a creativity long suppressed by theocracy was released as never before. Among its most dazzling sparks was the idea of building a machine that could help people calculate. From then on, one scientist after another worked tirelessly to turn this spark into a torch that would lead mankind into the kingdom of freedom. Limited by the overall level of science and technology of their time, however, most of them failed. Such is the common fate of pioneers: they seldom live to see the fruits of their labor. When later generations enjoy that sweetness, they should be able to taste some of the sweat and tears in it...
1614: The Scotsman John Napier (1550-1617) published a paper describing an ingenious device that could perform the four arithmetic operations and square-root calculations.
1623: Wilhelm Schickard (1592-1635) built a 'calculating clock' that could add and subtract numbers of up to six digits and signaled the answer through ***. It was operated by turning gears.
1625: William Oughtred (1575-1660) invented the slide rule.
1642: The French mathematician Blaise Pascal, building on Oughtred's slide rule, created a calculating device that could perform eight-digit calculations. Many were sold, and it became a fashionable commodity.
1668: The Englishman Samuel Morland (1625-1695) built a non-decimal adding machine suitable for counting coins.
1671: The German mathematician Gottfried Leibniz designed a machine that could perform multiplication, with answers of up to 16 digits.
1775: Charles Mahon (Lord Stanhope) of England built a machine similar to Leibniz's calculator, but somewhat more advanced.
1776: The German Philipp Matthäus Hahn successfully built a multiplying calculator.
1801: Joseph-Marie Jacquard developed an automatic loom controlled by punched cards.
1820: The Frenchman Charles Xavier Thomas de Colmar (1785-1870) produced the first commercially successful calculating machine, the Arithmometer. It was very reliable, small enough to sit on a desktop, and remained on the market for more than 90 years.
1822: The Englishman Charles Babbage (1792-1871) designed the Difference Engine and the Analytical Engine. His designs were far ahead of their time, resembling the electronic computers of a century later; in particular, the use of cards to input programs and data was adopted by later generations.
1832: Babbage and Joseph Clement built a working portion of a difference engine that could perform 6-digit operations. Later designs extended this to 20- and even 30-digit operations, in a machine nearly as big as a house, with results output as perforations. Given the manufacturing technology of the time, however, their designs were difficult to produce.
1834: George Scheutz of Stockholm built a difference engine out of wood.
1834: Babbage envisioned building a general-purpose analytical engine with programs and data stored on read-only punched cards. He continued this research in later years; by 1840 he had extended the operands to 40 digits and had essentially worked out the ideas of a control center (the forerunner of the CPU) and the stored program. His design could jump conditionally, performing a general addition in a few seconds and a multiplication or division in a few minutes.
1842: Babbage's Difference Engine project was canceled by *** because of its high development costs, but he continued to devote much time and energy to his Analytical Engine research.
1843: Scheutz and his son Edvard Scheutz built a difference engine, and the Swedish government agreed to continue supporting their research.
1847: Babbage spent two years designing a simpler, 31-digit difference engine, but no one was interested in funding its construction. Much later, however, the Science Museum in London replicated the machine with modern technology and found that it did indeed work accurately.
1848: The British mathematician George Boole founded Boolean (binary) algebra, paving the way for modern binary computers almost a century in advance.
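(An aside for modern readers: because each variable in Boole's algebra takes only the values 0 and 1, its laws can be checked exhaustively by machine. Below is a minimal illustrative check of De Morgan's law in Python, written for this article rather than taken from any historical source:)

    # Exhaustively verify De Morgan's law over the Boolean values {0, 1}:
    #   not (a and b)  ==  (not a) or (not b)
    from itertools import product

    for a, b in product((0, 1), repeat=2):
        assert (not (a and b)) == ((not a) or (not b))
    print("De Morgan's law holds for all 0/1 inputs")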
1853: To Babbage's delight, the Scheutzes succeeded in building a truly practical difference engine capable of 15-digit operations, outputting results just as Babbage had envisioned. Later, Brian Donkin of London built a second, more reliable unit.
1858: The first tabulating machine was purchased by the Dudley Observatory in Albany, and the second by the British ***. The observatory never made full use of its machine, which was eventually sent to a museum; the second one, fortunately, enjoyed a long working life.
1871: Babbage builds parts of the Analytical Engine and the printer.
1878: Ramon Verea, a Spaniard living in New York, built a desktop calculator faster than any of those mentioned above. He had no interest in bringing it to market; he simply wanted to show that a Spaniard could do better than the Americans.
1879: A committee of inquiry studied the feasibility of the Analytical Engine and concluded that it simply could not work. Babbage had already passed away by then, and after the inquiry his Analytical Engine was completely forgotten, though Howard Aiken would later prove the exception.
1885: More calculating machines appeared during this period, in the United States, Russia, Sweden, and elsewhere. They began to replace the failure-prone gears with grooved cylinders.
1886: Dorr E. Felt (1862-1930) of Chicago built the first key-operated calculator. It was very fast: the result appeared as soon as the keys were released.
1889: Felt launches desktop printing calculator.
1890: The United States census. The 1880 census had taken seven years to tabulate, which meant the 1890 figures would be more than ten years out of date, so the U.S. Census Bureau wanted a machine to make the count more efficient. Herman Hollerith, who went on to found the tabulating machine company that later became IBM, drew on Babbage's invention, used punched cards to store the data, and designed a machine around them. It produced accurate figures (*********** people) in only six weeks, and Hollerith made a fortune.
1892: William S. Burroughs (1857-1898) of St. Louis built a machine with more functionality than Felt's, truly founding the office automation industry.
1896: Herman Hollerith founded the Tabulating Machine Company, the predecessor of IBM.
1900~1910
1906: Henry Babbage, Charles Babbage's son, with the support of R. W. Munro, completed the analytical engine his father had designed, but only to prove that it worked; it was never rolled out as a product.
2. The early days of electronic computers
Computers up to this point all operated mechanically. Although some products began to incorporate electrical elements, these remained subordinate to the mechanics; computing had not yet entered its truly flexible realm: the field of logical operations.
After this, with the rapid development of electronic technology, the computer began its transition from the mechanical to the electronic era. Electronics increasingly became the main body of the computer and mechanics the subordinate part; as the status of the two reversed, computers began a qualitative change as well. The main events of this transition period follow:
1906: Lee De Forest of the United States invented the vacuum triode. Before it, building a digital electronic computer was impossible; this laid the foundation for the development of electronic computers.
1920~1930
February 1924: IBM, an epoch-making company, was founded.
1930~1940
1935: IBM launched the IBM 601, a punched-card machine that could perform a multiplication in one second. The machine held an important position both in natural science and in the commercial world; approximately 1,500 units were built.
1937: Alan M. Turing (1912-1954) of the University of Cambridge published his famous paper proposing the mathematical model that later generations would call the 'Turing machine'.
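(An aside: the model Turing proposed is simple enough to simulate in a few lines. Below is a minimal Python sketch of a Turing machine simulator; the state names and the bit-flipping example machine are invented here for illustration, not taken from Turing's paper.)

    # Minimal Turing machine simulator (illustrative sketch).
    # A machine is a table mapping (state, symbol) -> (next state, symbol to write, head move).
    def run(tape, table, state="start", halt="halt", blank="_"):
        cells = dict(enumerate(tape))   # sparse tape; unwritten cells read as blank
        head = 0
        while state != halt:
            state, write, move = table[(state, cells.get(head, blank))]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # Example machine: flip every bit of a binary string, halting at the first blank.
    flip = {
        ("start", "0"): ("start", "1", "R"),
        ("start", "1"): ("start", "0", "R"),
        ("start", "_"): ("halt", "_", "R"),
    }
    print(run("1011", flip))   # prints 0100_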
1937: George Stibitz of Bell Laboratories demonstrated a device that used relays to represent binary numbers. Although it was just a showpiece, it was the first binary relay computing device.
1938: Claude E. Shannon published a paper on logic representation using relays.
1938: Konrad Zuse and his assistants in Berlin completed a mechanical, programmable binary computer based on Boolean algebra, later named the Z1. It was relatively powerful, used something like photographic film as its storage medium, and could handle seven-digit exponents and 16-digit mantissas. Numbers were entered on a keyboard and results displayed with lamps.
January 1, 1939: William Hewlett and David Packard founded Hewlett-Packard in a California garage. The company's name, combining both surnames, had its order decided by a coin toss.
November 1939: The American John V. Atanasoff and his student Clifford Berry completed a 16-bit adder, the first calculating device to use vacuum tubes.
1939: At the beginning of World War II, military needs greatly promoted the development of computer technology.
1939: Zuse and Schreyer began developing the Z2, based on their Z1, improving its storage and computing units with relays. The project was interrupted for a year by Zuse's military service.
1939/1940: Schreyer completed a 10-bit adder using vacuum tubes, with neon lamps as its storage devices.
1940~1950
January 1940: Samuel Williams and Stibitz of Bell Labs completed a calculator that could operate on complex numbers. It used relays extensively, borrowed some telephone technology, and adopted advanced coding techniques.
Summer 1941: Atanasoff and his student Berry completed a computer that could solve linear algebraic equations, named the 'ABC' (Atanasoff-Berry Computer). It used capacitors as main memory and punched cards as auxiliary memory (the holes were actually 'burned' in). Its clock frequency was 60 Hz, and an addition took one second.
December 1941: In Germany, Zuse completed the Z3, the first program-controlled, programmable computer. It could handle 7-digit exponents and 14-digit mantissas, used a large number of relays, performed 3 to 4 additions per second, and took 3 to 5 seconds for a multiplication.
1943: Computers built between 1943 and 1959 are usually called first-generation computers. They used vacuum tubes, and all programs were written in machine code, entered on punched cards.
A typical machine: the UNIVAC.
January 1943: The Mark I, an automatic sequence-controlled computer, was successfully developed in the United States. The machine was 51 feet long, weighed 5 tons, and contained 750,000 parts, using 3,304 relays and 60 switches as mechanical read-only memory. Programs were stored on paper tape, and data could come from paper tape or card readers. It was used to calculate ballistic firing tables for the U.S. Navy.
April 1943: Max Newman, Wynn-Williams, and their research team completed 'Heath Robinson', a code-breaking machine. Strictly speaking it was not a computer, but it used logic elements and vacuum tubes, and its optical reader could process 2,000 characters per second; it too was of epoch-making significance.
September 1943: Williams and Stibitz completed the 'Relay Interpolator', later named the 'Model II Relay Calculator'. It was a programmable computer that likewise used paper tape to enter programs and data. It ran more reliably than its predecessors, represented each number with 7 relays, and could perform floating-point operations.
December 1943: The earliest programmable computer, the Colossus, was put into service in the UK with 2,400 vacuum tubes. Built to decipher German codes, it could process about 5,000 characters per second, but it was destroyed shortly after its service ended.
1946: ENIAC (Electronic Numerical Integrator and Computer): the first true digital electronic computer. Development began in 1943 and finished in 1946; the principals were John W. Mauchly and J. Presper Eckert. It weighed 30 tons, contained 18,000 vacuum tubes, and consumed 25 kilowatts of power. It was used mainly for ballistics calculations and the development of the hydrogen bomb.
3. The development of transistor computers
Although vacuum-tube computers already fell within the category of modern computers, they were large, power-hungry, failure-prone, and expensive, and their high cost greatly limited their spread and application. Not until the invention of the transistor did the electronic computer find its launching point for take-off; after that, nothing could hold it back...
1947: William B. Shockley, John Bardeen, and Walter H. Brattain of Bell Labs invented the transistor, ushering in a new era of electronics.
1949: EDSAC: Wilkes and his team at Cambridge University built a stored-program computer; input and output were still by paper tape.
1949: EDVAC (Electronic Discrete Variable Automatic Computer): the first computer to use magnetic tape. This was a breakthrough, as programs could be stored on tape and reloaded many times. The machine was proposed by John von Neumann.
1949: 'The computers of the future will weigh no more than 1.5 tons.' This was a bold prediction by a science magazine of the day.
1950~1960
1950: The floppy disk was invented by Yoshiro Nakamatsu of Tokyo Imperial University; its sales rights were obtained by IBM, ushering in a new era of storage.
1950: The British mathematician and computer pioneer Alan Turing declared that a computer could possess human intelligence: if a person converses with a machine and cannot tell from the questions asked and answered whether the other party is a machine or a human, then the machine has human intelligence.
1951: Grace Murray Hopper completes a high-level language compiler.
1951: Whirlwind: The U.S. Air Force's first computer-controlled real-time defense system is developed.
1951: UNIVAC-1: The first commercial computer system.
Designed by J. Presper Eckert and John Mauchly, and used by the U.S. Census Bureau for the census, it marked the computer's entry into a new era of commercial application.
1952: EDVAC (Electronic Discrete Variable Automatic Computer): designed and completed under von Neumann's leadership.
1953: There were approximately 100 computers operating in the world at this time.
1953: Magnetic core memory is developed.
1954: John Backus of IBM and his research team began to develop FORTRAN (FORmula TRANslation), which was completed in 1957. It is a high-level computer language suitable for scientific research.
1956: The first conference on artificial intelligence is held at Dartmouth College.
1957: IBM successfully developed the first dot matrix printer.
1957: FORTRAN high-level language developed successfully.
4. Integrated circuits give modern computers the wings to take off
Although the use of transistors greatly reduced computers' size, price, and failure rate, they still fell far short of what users demanded, and industry after industry developed an ever greater appetite for computing. Producing more powerful, lighter, and cheaper machines became the top priority, and the invention of the integrated circuit arrived like a timely rain. Its high level of integration not only shrank the machines but also raised their speed and reduced their failures. People began to build revolutionary microprocessors, and with years of accumulation, computer technology finally sped onto a highway paved with silicon.
September 12, 1958: Under the leadership of Robert Noyce (later a founder of INTEL), the integrated circuit was invented, and the microprocessor soon followed. Because the microprocessor's invention had borrowed technology from Japanese companies, Japan refused to recognize the patent, arguing that it had not received the benefits it deserved; only 30 years later did Japan recognize it, so that Japanese companies could receive a share of the profits. By 2001, however, the patent had expired.
1959: Computers designed between 1959 and 1964 are generally called second-generation computers. They made extensive use of transistors and printed circuits; machines kept shrinking in size while growing in capability, able to run FORTRAN and COBOL and to accept commands in English characters. Application software appeared in great numbers.
1959: Grace Murray Hopper began to develop the COBOL (COmmon Business-Orientated Language) language, which was completed in 1961.
1960~1970
1960: ALGOL, the first structured programming language, was launched.
1961: IBM's Kenneth Iverson introduces the APL programming language.
1963: PDP-8: DEC launches the first minicomputer.
1964: Computers from 1964 to 1972 are generally called third-generation computers. They made heavy use of integrated circuits; the typical model is the IBM 360 series.
1964: IBM releases the PL/1 programming language.
1964: Release of the first series of IBM 360 compatible machines.
1964: DEC releases the PDP-8 minicomputer.
1965: Moore's Law was announced: the density of integrated circuits (often restated as processor performance) would double every year; its content was later revised to a doubling roughly every two years.
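(A quick worked illustration of what such doubling implies, assuming the commonly cited two-year period and taking the 4004's roughly 2,300 transistors as the 1971 starting point; the projected figures are order-of-magnitude estimates only:)

    # Moore's law as commonly restated: transistor counts double about every two years.
    base_year, base_count = 1971, 2300   # INTEL 4004: ~2,300 transistors
    for year in (1981, 1991, 2001):
        doublings = (year - base_year) / 2
        print(year, round(base_count * 2 ** doublings))
    # Rough projections: 1981 -> ~74,000; 1991 -> ~2.4 million; 2001 -> ~75 million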
1965: Lotfi Zadeh created fuzzy logic to deal with problems of approximation.
1965: Thomas E. Kurtz and John Kemeny completed the development of the BASIC (Beginner's All-purpose Symbolic Instruction Code) language. Especially suited to computer education and beginners, it was widely promoted.
1965: Douglas Engelbart proposed the idea of the mouse but did not pursue it further; it was not widely adopted until 1983, by Apple Computer.
1965: The first supercomputer, the CDC 6600, was successfully developed.
1967: Niklaus Wirth began to develop the PASCAL language, which was completed in 1971.
1968: Robert Noyce and several of his friends founded INTEL.
1968: Seymour Papert and his research team developed the LOGO language at MIT.
1969: The ARPANET project was launched, which was the prototype of the modern INTERNET.
April 7, 1969: The first network-protocol standards document, RFC 1, was published.
1969: The EIA (Electronic Industries Association) issued the RS-232-C serial interface standard.
1970~1980
1970: INTEL launched the first RAM chip, with a capacity of 1K bits.
1970: Ken Thompson and Dennis Ritchie began developing the UNIX operating system.
1970: The Forth programming language was developed.
1970: ARPAnet (the Advanced Research Projects Agency network), the prototype of the INTERNET, was basically complete and began opening to non-military users; many universities and commercial organizations began to connect.
November 15, 1971: Marcian E. Hoff of INTEL successfully developed the first microprocessor, the 4004. It contained 2,300 transistors, was a 4-bit system, ran at a clock frequency of 108 KHz, and executed 60,000 instructions per second.
In the years that followed, the main indicators of processor development ran as follows:

Processor     Clock speed   MIPS (million instructions per second)
4004          108 KHz       0.06
8080          2 MHz         0.5
68000         8 MHz         0.7
8086          8 MHz         0.8
68000         16 MHz        1.3
68020         16 MHz        2.6
80286         12 MHz        2.7
68030         16 MHz        3.9
386 SX        20 MHz        6
68030         25 MHz        6.3
68030         40 MHz        10
386 DX        33 MHz        10
486 DX        25 MHz        20
486 DX2-50    50 MHz        35
486 DX4/100   100 MHz       60
Pentium       66 MHz        100
Pentium       133 MHz       240
Pentium MMX   233 MHz       435
Pentium Pro   200 MHz       440
Pentium II    233 MHz       560
Pentium II    333 MHz       770
1971: Development of the PASCAL language was completed.
1972: Computers built after 1972 are customarily called fourth-generation computers, based on large-scale and, later, very-large-scale integrated circuits. Computers grew still more powerful and more compact; people began to wonder whether computers could keep shrinking, in particular whether the heat problem could be solved, and discussion turned to the development of fifth-generation computers.
1972: Development of the C language was completed. Its principal designer was Dennis Ritchie, one of the developers of the UNIX system. C is a very powerful language, especially popular for developing system software.
1972: Hewlett-Packard invented the first handheld calculator.
April 1, 1972: INTEL launches the 8008 microprocessor.
1972: ARPANET began to go global, and the INTERNET revolution began.
1973: The arcade game Pong was released and widely welcomed. It was invented by Nolan Bushnell, the founder of Atari.
1974: CLIP-4, the first parallel computer architecture, is launched.
5. Computer technology gradually enters its golden age
Until this point, computer technology had developed mainly in the realm of mainframes and minicomputers, but with advances in very-large-scale integration and microprocessor technology, the technical barriers keeping computers out of ordinary homes were broken through one by one. Especially after INTEL released the 8080, its microprocessor for personal computers, the wave surged higher and higher. At the same time a host of information-age trendsetters emerged, such as Steve Jobs and Bill Gates, whose influence on the development of the computer industry continues to this day. In the same period, Internet technology and multimedia technology also developed as never before, and computers truly began to change people's lives.
April 1, 1974: INTEL releases its 8-bit microprocessor chip 8080.
December 1974: MITS releases the Altair 8800, the first commercial personal computer, priced at $397 and carrying 256 bytes of memory.
1975: Bill Gates and Paul Allen completed the first BASIC program to run on MITS's Altair computer.
1975: IBM introduced its laser printer technology; in 1988 it brought a color laser printer to market.
1975: Bill Gates and Paul Allen founded Microsoft, today the largest and most successful software company. Three years later its revenue had grown to US$500,000 and its staff to 15 people; by 1992 revenue reached US$2.8 billion with 10,000 employees. Its biggest breakthrough came in 1981, when it developed the operating system for IBM's PC; from then on it exerted enormous influence over the computer industry.
1975: IBM 5100 released.
1976: Stephen Wozniak and Steve Jobs founded Apple Computer and introduced the Apple I computer.
1976: Zilog launched the Z80, an 8-bit microprocessor. CP/M was the operating system developed for it, and much famous software, such as WordStar and dBase II, was based on this processor.
1976: The 6502, an 8-bit microprocessor, was released and went on to be used in the Apple II computer.
1976: The Cray 1, the first commercial supercomputer, integrated 200,000 transistors and performed 150 million floating-point operations per second.
May 1977: Apple II computer released.
1978: The Commodore PET was released, with 8K of RAM, a cassette tape drive, and a 9-inch monitor.
June 8, 1978: INTEL releases its 16-bit microprocessor 8086.
However, because the 8086 was very expensive, the 8-bit 8088 was launched to meet the market's need for a low-cost processor, and it was adopted for IBM's first-generation PC. Its available clock frequencies were 4.77, 8, and 10 MHz; it had approximately 300 instructions and integrated 29,000 transistors.
1979: The arcade game 'Space Invaders' was released, causing a sensation. It soon made such game consoles so popular that their revenue exceeded that of the American film industry.
1979: Jean Ichbiah completed the development of the Ada computer language.
June 1, 1979: INTEL released the 8-bit 8088 microprocessor purely to cater to the needs of low-cost computers.
1979: Commodore released a PET model with a 1 MHz 6502 processor, a monochrome display, and 8K of memory, with further memory expansions available for purchase as needed.
1979: Low-density disk invented.
1979: Motorola released the 68000 microprocessor, later supplied mainly for Apple's Macintosh; its successor, the 68020, was used in the Macintosh II.
1979: IBM, seeing the personal computer market occupied by Apple and other computer companies, decided to develop its own personal computer. To launch a product as quickly as possible, it delegated much of the work to third parties, with Microsoft responsible for developing the operating system. The IBM PC duly appeared on August 12, 1981, but the arrangement also provided ample fertilizer for Microsoft's subsequent rise.
1980~1990
1980: "As long as there is 1 megabyte of memory, DOS can perform its full performance." Microsoft said in the early days of developing DOS. What do you think when you hear this sentence today?
October 1980: Development of MS-DOS/PC-DOS began. Microsoft did not yet have an independent operating system of its own; it bought another company's operating system and improved it. Even so, IBM found 300 bugs during testing, so the improvements continued; the original DOS 1.0 comprised 4,000 lines of assembly code.
1981: Xerox began work on graphical user interfaces, icons, menus, and pointing devices (such as the mouse). Its research results were later taken up by Apple, and Apple Computer in turn accused Microsoft of plagiarizing the design when it developed the WINDOWS series.
1981: The 80186/80188 chips released by INTEL were rarely used because their registers were incompatible with other chips, though they introduced direct memory access and time-slice timesharing technology.
August 12, 1981: IBM released its personal computer, priced at $2,880. The machine had 64K of memory, a monochrome display, an optional cassette tape drive, and two 160KB single-sided floppy disk drives.