When was the computer invented?

The first electronic computer, ENIAC, was born in the United States in February 1946. The American mathematician John von Neumann proposed the stored-program concept. With funding from the U.S. Army, development of ENIAC began in 1943 and was completed in 1946.

1. The birth of mechanical computers

In Western Europe, the great social changes from the Middle Ages to the Renaissance greatly promoted the development of natural science and technology, and creativity long suppressed by theocracy was released as never before. Among these sparks of thought and invention, the idea of building a machine that could help people calculate was one of the most dazzling. One scientist after another worked tirelessly to realize this great dream, but given the technology of the time, most experimental creations ended in failure. This reveals the common fate of pioneers: they often did not live to see the results of their efforts, and when later generations enjoy those sweet fruits, they can taste the sweat and tears woven into them.

1614: The Scotsman John Napier (1550-1617) published a paper mentioning an ingenious device he had invented that could perform the four arithmetic operations and extract square roots.
1623: Wilhelm Schickard (1592-1635) built a "calculating clock" that could add and subtract six-digit numbers, ringing a bell to signal an overflow. The device was operated by turning gears.
1625: William Oughtred (1575-1660) invented the slide rule.
1668: The Englishman Samuel Morland (1625-1695) built a non-decimal adding machine suited to calculating sums of money.
1671: The German mathematician Gottfried Leibniz designed a calculating machine that could multiply, with answers up to 16 digits long.
1822: The Englishman Charles Babbage (1792-1871) designed the Difference Engine and the Analytical Engine.
His designs were far ahead of their time, resembling the electronic computers of a century later; in particular, the use of cards to input programs and data was adopted by later generations.

1834: Babbage envisioned a general-purpose Analytical Engine that would store programs and data in read-only memory (punched cards). He continued this work in the following years; by 1840 he had increased the number of operating digits to 40 and had essentially realized the ideas of a control unit (the forerunner of the CPU) and the stored program, with programs able to jump on conditions. The machine could perform an ordinary addition in a few seconds and a multiplication or division in a few minutes.
1848: The British mathematician George Boole founded Boolean algebra, paving the way nearly a century in advance for the binary logic of modern computers.
1890: The U.S. Census Bureau wanted a machine to make the census more efficient. Herman Hollerith (whose company later became IBM) borrowed Babbage's ideas and designed a machine that stored data on punched cards. The result was accurate census figures in just six weeks, where manual methods would have taken about ten years.
1896: Herman Hollerith founded the company that became the predecessor of IBM.

2. The advent of electronic computers

A hundred years after the birth of the mechanical calculator, rapid advances in electronics began the computer's real transition from the mechanical to the electronic era. Electronic devices gradually became the main body of the computer, with mechanical components relegated to a subordinate role; as the two traded places, quantitative change became qualitative, and the electronic computer formally arrived. The main events of this transitional period:

1906: The American Lee De Forest invented the vacuum tube (triode), laying the foundation for the development of electronic computers.
February 1924: IBM was established; an epoch-making company was born.
1935: IBM launched the IBM 601, a punched-card machine that could do a multiplication in one second.

This machine played an important role in both scientific and commercial applications, and approximately 1,500 units were manufactured.
1937: Alan M. Turing (1912-1954) of the University of Cambridge published his famous paper proposing the mathematical model later known as the "Turing machine."
1937: George Stibitz of Bell Laboratories demonstrated a relay device that represented numbers in binary. Although only a demonstration piece, it pointed the way to binary computing machines.
January 1940: Samuel Williams and Stibitz of Bell Labs completed a computer that could perform complex calculations. It made extensive use of relays, borrowed from telephone technology, and employed advanced coding techniques.
Summer 1941: Atanasoff and his student Berry completed a computer that could solve systems of linear equations, named the ABC (Atanasoff-Berry Computer). It used capacitors as memory and punched cards as auxiliary storage, the holes actually being burned into the cards. The clock frequency was 60Hz, and an addition took one second.
January 1943: The Mark I automatic sequence-controlled computer was completed in the United States. The machine was 51 feet long, weighed 5 tons, and contained 750,000 parts, including 3,304 relays and 60 switches serving as mechanical read-only memory. Programs were stored on paper tape; data could come from paper tape or card readers. The Mark I was used to calculate ballistics tables for the U.S. Navy.
September 1943: Williams and Stibitz completed the "Relay Interpolator," later named the Model II Relay Calculator. It was a programmable computer that also read programs and data from paper tape. It ran more reliably, represented each number with 7 relays, and could perform floating-point operations.
1946: ENIAC (Electronic Numerical Integrator And Computer) was born, the first true electronic digital computer.
Development began in 1943 and was completed in 1946, led by John W. Mauchly and J. Presper Eckert. ENIAC weighed 30 tons, used 18,000 vacuum tubes, and consumed 25 kilowatts of power. It was used mainly for ballistics calculations and the development of the hydrogen bomb.

3. The development of transistor computers

Although vacuum-tube machines already counted as modern computers, their large size, high power consumption, frequent failures, and high price limited their spread and application. Only with the invention of the transistor did the electronic computer find its launching point.

1947: William B. Shockley, John Bardeen, and Walter H. Brattain of Bell Laboratories invented the transistor, ushering in a new era of electronics.
1949: Wilkes and his team at Cambridge University built a stored-program computer; its input and output device was still paper tape.
1949: EDVAC (Electronic Discrete Variable Automatic Computer), the first computer to use magnetic tape, was completed. This was a breakthrough: programs could be stored on tape and loaded repeatedly. The machine's design was proposed by John von Neumann.
1950: Yoshiro Nakamatsu of Tokyo Imperial University invented the floppy disk, whose sales rights were acquired by IBM, opening a new era of storage.
1951: Grace Murray Hopper completed a compiler for a high-level language.
1951: UNIVAC-1, the first commercial computer system, designed by J. Presper Eckert and John Mauchly, was born. Its use by the U.S. Census Bureau marked the computer's entry into the era of commercial applications.

1953: Magnetic-core memory was developed.
1954: John Backus and his team at IBM began developing FORTRAN (FORmula TRANslation), completed in 1957, a high-level language well suited to scientific work.
1957: IBM developed the first dot-matrix printer.

4. Integrated circuits pave the way for modern computers

Although the transistor greatly reduced the size, price, and failure rate of computers, machines still fell far short of users' actual needs, and demand from every industry kept growing. Producing machines with stronger performance, lighter weight, and lower price became imperative. The invention of the integrated circuit solved this problem: high integration not only shrank the computer but also increased speed and reduced failures. From then on, the way was open for the revolutionary microprocessor.

September 12, 1958: The integrated circuit was born, first demonstrated by Jack Kilby of Texas Instruments (Robert Noyce, later a founder of Intel, independently developed a practical version soon after), and the microprocessor followed. Because Japanese companies had drawn on the technology without the inventors receiving the benefits they deserved, Japan long refused to recognize the patent; only some 30 years later did Japan grant it, allowing a share of the profits to be collected from Japanese companies, and by 2001 the patent had expired.
1959: Grace Murray Hopper began developing the COBOL (COmmon Business-Oriented Language) language, completed in 1961.
1960: ALGOL, the first structured programming language, was released.
1961: Kenneth Iverson of IBM introduced the APL programming language.
1963: DEC launched the PDP-5 minicomputer; its famous successor, the PDP-8, followed in 1965.
1964: IBM released the PL/1 programming language.
1964: IBM released the System/360, the first series of compatible machines.
1965: Moore's Law was announced: the number of transistors on a processor doubles roughly every 18 months while the price halves.
1965: Lotfi Zadeh created fuzzy logic for dealing with problems of approximation.
1965: Thomas E. Kurtz and John Kemeny completed the BASIC (Beginner's All-purpose Symbolic Instruction Code) language, especially suited to computer education and beginners; it was widely promoted.
1965: Douglas Engelbart proposed the idea of the mouse, but it saw little further development until Apple adopted it widely in 1983.
1965: The first supercomputer, the CDC 6600, was completed.
1967: Niklaus Wirth began developing the Pascal language, completed in 1971.
1968: Robert Noyce and several friends founded Intel.
1968: Seymour Papert and his research group at MIT developed the LOGO language.
1969: The ARPANET (Advanced Research Projects Agency Network) project was launched; it was the prototype of the modern Internet.
April 7, 1969: RFC 1, the first of the network-protocol standards documents known as RFCs, was published.
1970: Intel launched the first RAM chip, with a capacity of 1KB.
1970: Ken Thompson and Dennis Ritchie began developing the UNIX operating system.
1970: The Forth programming language was developed.
1970: ARPANET, the prototype of the Internet, was essentially complete and began to open to non-military users.
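The doubling rule just quoted can be turned into a quick back-of-the-envelope projection. This is a minimal sketch (the 18-month doubling period is the figure stated above; the function name is mine, and real transistor counts did not follow the rule exactly):

```python
def projected_transistors(start_count, start_year, target_year, doubling_months=18):
    """Project a transistor count assuming a fixed doubling period (Moore's Law)."""
    months_elapsed = (target_year - start_year) * 12
    doublings = months_elapsed / doubling_months
    return start_count * 2 ** doublings

# Starting from the 4004's 2,300 transistors in 1971, project to 1993:
count_1993 = projected_transistors(2300, 1971, 1993)
print(f"Projected for 1993: {count_1993:,.0f} transistors")
```

Run from the 4004's 2,300 transistors, the 18-month rule projects tens of millions of transistors by the early 1990s, well above the Pentium's actual 3 million or so, which is why the figure is best treated as a rough rule of thumb rather than an exact law.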

November 15, 1971: Marcian E. Hoff of Intel completed the first microprocessor, the 4004. It contained 2,300 transistors, had a 4-bit word length and a 108KHz clock, and executed 60,000 instructions per second.
1972: Computers built after 1972 are customarily called fourth-generation computers, based on large-scale and later very-large-scale integrated circuits. Computers of this period were more powerful and smaller. People began to wonder whether computers could keep shrinking, in particular whether the heat problem could be solved; at the same time, discussion of fifth-generation computers began.
1972: The C language was developed. Its principal designer was Dennis Ritchie, one of the developers of UNIX. It is a very powerful and extremely well-loved language.
1972: Hewlett-Packard invented the first handheld scientific calculator.
April 1, 1972: Intel launched the 8008 microprocessor.
1972: ARPANET began to go international, and the Internet revolution was under way.
1973: The arcade game Pong was released and became widely popular. Its inventor was Nolan Bushnell, the founder of Atari.
1974: CLIP-4, the first parallel computer architecture, was launched.

5. Contemporary computer technology gradually enters its golden age

Until this point, computer technology had still centered on the development of mainframes and minicomputers. With advances in VLSI and microprocessor technology, the technical barriers keeping computers out of ordinary homes were gradually overcome. Especially after Intel released its 8080 microprocessor for individual users, the wave finally surged, producing a generation of trendsetters of the information age, such as Steve Jobs and Bill Gates, who still play a decisive role in the development of the computer industry.
During this period, Internet and multimedia technology also saw unprecedented application and development, and computers truly began to change our lives.

April 1, 1974: Intel released its 8-bit microprocessor, the 8080.
1975: Bill Gates and Paul Allen completed the first BASIC interpreter to run on the MITS Altair computer.
1975: Bill Gates and Paul Allen founded Microsoft, now the world's largest and most successful software company. Three years later its revenue had grown to $500,000 and its headcount to 15; by 1992 revenue was $2.8 billion with 10,000 employees. In 1981 Microsoft developed the operating system for the IBM PC, establishing its leading position in computer software.
1976: Steve Wozniak and Steve Jobs founded Apple Computer and launched the Apple I.
June 8, 1978: Intel released its 16-bit microprocessor, the 8086. In June 1979 the 8088, a 16-bit processor with an 8-bit external bus, was launched to meet market demand for low-cost processors; it was adopted for IBM's first-generation PC. The processor came in 4.77MHz, 8MHz, and 10MHz clock speeds, had roughly 300 instructions, and integrated 29,000 transistors.
1979: The low-density floppy disk was born.
1979: Seeing the personal-computer market occupied by companies such as Apple, IBM decided to build its own personal computer. To launch a product as quickly as possible, IBM handed much of the work to third parties; among them was Microsoft, which undertook the operating system and thereby laid the foundation for its later rise. The IBM PC was launched on August 12, 1981.
1980: "One megabyte of memory is plenty for DOS," Microsoft said in the early days of DOS development.

What do you think when you hear that statement today?

1981: Xerox began work on graphical user interfaces, icons, menus, and pointing devices such as the mouse. The research results were taken up by Apple, and Apple Computer later accused Microsoft of plagiarizing its design in the Windows series of software.
August 12, 1981: MS-DOS 1.0 and PC-DOS 1.0 were released. Microsoft had been commissioned by IBM to develop a DOS operating system; it purchased a program called 86-DOS from Tim Paterson and improved it. The version sold by IBM was called PC-DOS, the one sold by Microsoft MS-DOS. Microsoft and IBM cooperated on DOS until DOS 5.0 in 1991. The original DOS 1.0 was very crude: each disk had only a root directory, with no support for subdirectories, which did not change until version 2.0 in March 1983. MS-DOS remained the operating system of IBM-PC compatibles until Windows 95 launched in 1995 and quickly took over the market; the final version was named DOS 7.0.
1982: The Internet based on the TCP/IP protocol began to take shape.
February 1982: The 80286 was released, with clock frequencies up to 20MHz, a new protected mode, access to 16MB of memory, support for over 1GB of virtual memory, 2.7 million instructions per second, and 134,000 transistors.
Spring 1983: The IBM XT was released, adding a 10MB hard drive, 128KB of memory, a floppy drive, a monochrome monitor, a printer, and a socket for an 8087 math coprocessor. The price at the time was $5,000.
March 1983: MS-DOS 2.0 and PC-DOS 2.0 added UNIX-style hierarchical directories.
1984: DNS (the Domain Name System) was introduced, with more than 1,000 hosts running on the Internet.
End of 1984: Compaq began developing the IDE interface, which transferred data at higher speeds and was adopted by many of its peers.
Later the higher-performance EIDE interface was developed from it.

1985: Philips and Sony jointly launched the CD-ROM drive.
October 17, 1985: The 80386 DX was launched, with clock frequencies reaching 33MHz, a 32-bit address space, 6 million instructions per second, and 275,000 transistors.
November 1985: Microsoft Windows was released. It required DOS underneath and resembled Apple's interface, so Apple sued; the lawsuit did not end until August 1997.
December 1985: MS-DOS 3.2 and PC-DOS 3.2 were released, the first versions to support 3.5-inch disks, though only up to 720KB; version 3.3 added 1.44MB support.
1987: Microsoft Windows 2.0 was released.
1988: The EISA standard was established.
1989: Tim Berners-Lee of CERN created the prototype of the World Wide Web. With hypertext links, even novices could browse the Web easily, which greatly promoted the development of the Internet.
March 1989: The EIDE standard was established; it supported hard drives larger than 528MB, reached transfer speeds of 33.3MB/s, and was adopted by many CD-ROM drives.
April 10, 1989: The 80486 DX was released, integrating 1.2 million transistors; its successors reached clock speeds of 100MHz.
November 1989: The Sound Blaster card (sound card) was released.
May 22, 1990: Microsoft released Windows 3.0, compatible with MS-DOS mode.
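As an aside on the 528MB barrier mentioned above: the article does not explain the figure, but it is commonly derived from the cylinder/head/sector (CHS) addressing limits of the original PC BIOS and IDE interface. A quick arithmetic check, assuming the standard limits of 1024 cylinders, 16 heads, and 63 sectors of 512 bytes:

```python
# Derivation of the 528MB disk-size barrier from CHS addressing limits
# (1024 cylinders x 16 heads x 63 sectors x 512-byte sectors; these
# limits are standard lore, not stated in the article itself).
cylinders, heads, sectors, sector_bytes = 1024, 16, 63, 512
max_bytes = cylinders * heads * sectors * sector_bytes
print(max_bytes)                  # total addressable bytes: 528482304
print(round(max_bytes / 10**6))   # about 528 decimal megabytes
```

Extensions such as EIDE sidestepped this ceiling with translated geometries and, later, logical block addressing.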

November 1990: The first-generation MPC (Multimedia Personal Computer) standard was released. It required at least an 80286/12MHz processor (later raised to an 80386SX/16MHz) and an optical drive with a transfer rate of at least 150KB/s.
1991: The ISA standard was published.
June 1991: MS-DOS 5.0 and PC-DOS 5.0 were released. To promote the development of OS/2, Bill Gates said DOS 5.0 would be the end of DOS and that no more energy would be spent on it. This version broke through the 640KB base-memory limit, and it marked the end of Microsoft's cooperation with IBM on DOS.
April 1992: Windows 3.1 was released.
1993: Windows NT was released, able to address 2GB of memory.
1993: The Internet began commercial operation.
1993: The classic game Doom was released.
March 22, 1993: The Pentium was released, integrating more than 3 million transistors; early versions had core frequencies of 60-66MHz and executed 100 million instructions per second.
May 1993: The MPC Level 2 standard was released, requiring a CD-ROM transfer rate of 300KB/s and playback of 15 frames per second in a 320 × 240 window.
March 7, 1994: Intel released 90- and 100MHz Pentium processors.
1994: The Netscape 1.0 browser was released.
1994: The famous real-time strategy game Command & Conquer was released.
March 27, 1995: Intel released the 120MHz Pentium.
June 1, 1995: Intel released the 133MHz Pentium.
August 23, 1995: Windows 95, a true 32-bit multitasking operating system, was released. It differed greatly from its predecessors, breaking free of MS-DOS while retaining a DOS mode to accommodate users' habits. Windows 95 was a huge success.
November 1, 1995: The Pentium Pro was released, with clock speeds up to 200MHz, 440 million instructions per second, and 5.5 million transistors.
December 1995: Netscape released JavaScript.
January 1996: Netscape Navigator 2.0 was released, the first browser to support JavaScript.
January 4, 1996: Intel released 150- and 166MHz Pentium processors, integrating 3.1-3.3 million transistors.
1996: Windows 95 OSR2 was released, fixing bugs and extending functionality.
1997: Famous games such as Grand Theft Auto, Quake 2, and Blade Runner were released, driving the rapid rise of 3D graphics accelerator cards.
January 8, 1997: Intel released the Pentium MMX, enhancing the processor's gaming and multimedia capabilities.
April 1997: IBM's Deep Blue computer defeated the world chess champion, Garry Kasparov.
May 7, 1997: Intel released the Pentium II, adding more instructions and more cache.
June 2, 1997: Intel released the 233MHz Pentium MMX.
February 1998: Intel released the 333MHz Pentium II, manufactured on a 0.25 μm process, raising speed while reducing heat.

June 25, 1998: Microsoft released Windows 98. Moves to break up Microsoft were under way; Microsoft responded that doing so would harm the national interests of the United States.
January 25, 1999: Linux kernel 2.2.0 was released, and people had high hopes for it.
February 22, 1999: AMD released the 400MHz K6-III processor.
July 1999: The Pentium III was released, with initial clock frequencies above 450MHz, a bus speed of 100MHz or more, a 0.25μm manufacturing process, support for the SSE multimedia instruction set, and 512KB of level-2 cache.
October 25, 1999: The Pentium III codenamed Coppermine was released. Manufactured on a 0.18 μm process, the Coppermine die was further reduced in size: although it integrated a 256KB full-speed on-die L2 cache and 28 million transistors, it measured only 106 square millimeters.
March 2000: Intel released a new generation of Celeron processors codenamed "Coppermine-128". The most significant difference from the old Celerons was that the new chips used the same Coppermine core and the same FC-PGA packaging as the new Pentium III, and likewise supported the SSE multimedia extension instruction set.