History of computing
Revision as of 15:42, 3 May 2008


Introduction

Prior to World War II, the word computer generally meant a person who computes. But during the war, the first electronic computers, machines that perform numerical calculations far faster than humans, were developed. Most, but not all, were funded by the British and U.S. governments through secret military projects[1][2]. After the war, computers were quickly adopted by private industry and grew rapidly in usefulness while decreasing in size and cost.

The human longing for mechanical help in performing complex computations existed long before technology was advanced enough to realize a practical solution. But the electronic computer's rapid evolution forever changed science, the military, and business. The electronic computer has vastly expanded the human ability to store and share information; as such, its invention may be a milestone for humanity on a par with the advent of writing and of materials to write on (millennia ago)[3], or with the invention of the printing press (c. 1450)[4]. Not all of this may be regarded as positive, however; the computer's explosive intrusion into all facets of life is sometimes referred to as the digital revolution[5].

Today, computers are ubiquitous household objects, perhaps unrecognized in the form of a tiny microprocessor embedded in a gadget such as a phone or a TV remote. Even defining the word computer may spark a debate, because so many different kinds of computers exist, and they are used for so many different kinds of activities.

Early devices (ancient times)

Long before the arrival of mechanical computing, ancient civilizations devised various methods to calculate and keep track of numbers.

Salamis Tablet (300 B.C.)

A very early counting device, the Salamis Tablet[6], was a counting board discovered on the Greek island of Salamis; boards of this kind were used with pebbles or counters to keep track of numbers.

Abacus (300 B.C.)

The abacus, a mechanical aid to performing arithmetic, dates back many centuries and is still in use in various forms.

Mechanical computing (Renaissance to 1900)

Codex Madrid (Da Vinci)

On 13 February 1967, the "Codex Madrid", written by Leonardo da Vinci, was discovered in the National Library of Spain by Dr. Roberto Guatelli[7]. Inside the Codex Madrid was a drawing of an elaborate mechanical computational device. Guatelli noticed that a similar construct appeared in da Vinci's "Codex Atlanticus". A prototype of this machine was created in 1968, and it was observed to behave as a ratio machine: one revolution of the first shaft (10^1) invoked ten revolutions of the second (10^2), and so on up to the last shaft, which rotated at a rate of ten to the thirteenth power. Whether this was a true computational device has been the subject of some debate. Previously displayed at IBM, the exhibit was removed for lack of consensus about its nature, and is presumed to be in one of IBM's storage facilities.

Pascaline (1642)

An early mechanical computational device is the Pascaline, created by Blaise Pascal circa 1642.[8] The Pascaline performed simple addition and subtraction, carrying between places by gear rotation. Functionally, the machine worked by increasing values on a single cog, which ranged from 0 to 9. When a cog rolled past 9 and reset to 0, the carry mechanism rotated the next cog over by one position.[9]
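The cascading carry of Pascal's gears can be sketched in a few lines of Python (a modern illustration of the principle, not a claim about the machine's exact construction):

```python
def pascaline_add(dials, amount):
    """Add `amount` to a Pascaline-style register.

    `dials` is a list of digits, least significant first; each dial
    holds 0-9 and, on rolling past 9, advances the next dial by one,
    mimicking the cascading carry of Pascal's gears.
    """
    carry = amount
    result = list(dials)
    for i in range(len(result)):
        total = result[i] + carry
        result[i] = total % 10   # the dial shows only 0-9
        carry = total // 10      # overflow turns the next gear
        if carry == 0:
            break
    return result

# 199 + 1: the carry ripples through all three dials
print(pascaline_add([9, 9, 1], 1))  # -> [0, 0, 2], i.e. 200
```

Subtraction on the real machine was done by adding nines' complements, since the gears could only turn one way.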

Charles Babbage's Difference and Analytical Engines (early 1800's)

It fell to Charles Babbage, born on December 26, 1791 and a Fellow of the Royal Society, to develop the first truly successful automatic calculating machine[10]. In 1821, Babbage developed the Difference Engine No. 1, a machine designed to compile mathematical tables based on polynomial calculation[11]. The engine's physical algorithm was based on a mathematical technique known as the Method of Differences, to which Babbage contributed work. Unfortunately, only a fragment of the machine would ever come to fruition, owing to various financial disputes and accusations of fund mismanagement from the British Government. More importantly, the machine was never fully developed because Babbage had conceived an improved machine, the Analytical Engine. Functionally, the Analytical Engine was capable of various algorithmic operations that were broken down into basic algebraic operations. Two sets of cards would be used to program the system: the first would detail the operations to be performed, and the second would contain the values to be operated on. In this sense, the Analytical Engine was much like a modern computer, having an input (the algorithm as described on cards), a processor (the machine), an output (the result), and memory (the cards themselves serving as a storage medium). Like the Pascaline, both the Difference and Analytical Engines relied on series of cogs and gears to compute values.
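The Method of Differences reduces polynomial tabulation to repeated addition, which is precisely what made it mechanizable with geared columns. A short Python sketch of the idea (illustrative only; the engine itself worked on decimal digit wheels):

```python
def difference_engine(initial_differences, steps):
    """Tabulate a polynomial by the Method of Differences.

    For a degree-n polynomial the n-th difference is constant, so
    every new table entry needs only additions, never multiplication.

    `initial_differences` holds [f(0), delta-f(0), delta2-f(0), ...].
    """
    cols = list(initial_differences)
    table = [cols[0]]
    for _ in range(steps):
        # fold each lower difference into the column above it
        for i in range(len(cols) - 1):
            cols[i] += cols[i + 1]
        table.append(cols[0])
    return table

# f(x) = x^2: f(0)=0, first difference 1, second difference 2
print(difference_engine([0, 1, 2], 5))  # -> [0, 1, 4, 9, 16, 25]
```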

Hollerith and punched cards (1884)

Herman Hollerith was born on February 29, 1860 in New York. In 1875 Hollerith attended the City College of New York, and he graduated from the Columbia School of Mines in 1879 with an engineering degree.[12] After graduating, Hollerith took up work with the United States Census Bureau, where he was appointed Chief Special Agent. Hollerith's contribution to computing was inspired by his work at the Census Bureau, especially by Dr. John Shaw Billings, who suggested that there should be a way to process the large amount of census data by some mechanical means. In 1884, Hollerith worked to develop a way to tabulate census information through the use of punched cards. Eventually, he recognized that the cards could serve as a storage medium for census data. His experiments led to a process whereby a pin would pass through a hole in the card to complete an electrical circuit. His system, by which cards could be read and tabulated on a mechanical counter through circuit completion, was called the Hollerith Electric Tabulating System. By 1890, the machines had been improved so that a simple keyboard could be used to tabulate data instead of entry by hand.
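The tallying principle can be shown with a toy Python sketch, where each card is a string of columns and a '1' stands for a punched hole (the column meanings here are invented for illustration and are not Hollerith's actual census layout):

```python
def tabulate(cards, column):
    """Count the cards punched at `column`.

    Models the Tabulating System's behavior: each time a pin passes
    through a hole and closes a circuit, a counter dial advances by one.
    """
    count = 0
    for card in cards:
        if card[column] == "1":  # pin passes through, circuit closes
            count += 1
    return count

# Four cards; suppose column 0 records one yes/no census answer
cards = ["100", "110", "000", "101"]
print(tabulate(cards, 0))  # -> 3
```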

Prerequisites to the first electronic computers (early 1900's)

Switches

The discovery of electricity, along with the invention of electronic switches, including the solenoid and the switching vacuum tube, was a necessary prerequisite to the invention of electronic computers.

Switching algebra

Boolean algebra, an algebraic system consisting of only two values, was destined to become the basis for describing the digital logic circuits used to build electronic computers. The realization that "switching algebra", as Boolean algebra came to be known by computer designers, could be used to describe logic circuits was a major conceptual breakthrough, first documented by Claude Shannon (1916-2001) in his 1938 MIT master's thesis[13]. Shannon's thesis created a stir in the world of electronics in 1938[14], though Shannon became better known in later years for founding the field of information theory.
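Shannon's correspondence between switch networks and Boolean expressions can be illustrated in a few lines of Python: contacts wired in series behave like AND, contacts in parallel like OR, and a normally-closed contact like NOT. A minimal sketch (modern notation, not Shannon's):

```python
# Relay-circuit view of Boolean algebra, on 0/1 values
def AND(a, b): return a & b   # two switches in series
def OR(a, b):  return a | b   # two switches in parallel
def NOT(a):    return 1 - a   # a normally-closed contact

def half_adder(a, b):
    """One-bit addition expressed purely in switching algebra."""
    total = OR(AND(a, NOT(b)), AND(NOT(a), b))  # XOR: the sum bit
    carry = AND(a, b)                           # the carry bit
    return total, carry

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```

The same expressions describe either a network of relays or a network of vacuum-tube gates, which is exactly why the thesis mattered to computer designers.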

The first electronic computers (1940's)

Early electronic computing research was carried out by academics, industrialists, and entrepreneurs, but the bulk of the funding (with the exception of Konrad Zuse's work) came from the American and British militaries during World War II. During the war, computing research was carried out under heavy secrecy, but after the war, computing was adopted rapidly by industry.

Zuse Z3 (1941)

German engineer Konrad Zuse built the Z1 computer between 1936 and 1938. German patent applications provide evidence of Zuse's development of a mechanical memory device in 1936, used in the Z1.[15] Zuse built the Z2 sometime between 1936 and 1939, and the Z3 from 1939 to 1941[16]. The German government was not supportive of Zuse's work, and evidence of the Z3's existence was discovered by Allied forces only after the end of World War II. All photographs of the Zuse Z3 were destroyed in Allied raids during the war[17]. Zuse's constructions incorporated advanced concepts, including the implementation of the binary numeral system. Having survived the war, Zuse built another computer in Switzerland, and was later the first designer to propose pipelining the computations of a computer processor. In 1949, Zuse formed Zuse KG, where he worked until 1966. Zuse's Z3 is now recognized as probably having been the first general-purpose electromechanical computer.

Atanasoff pre-computer (1942)

Between 1937 and 1942, Dr. John V. Atanasoff and graduate student Clifford Berry, of Iowa State University, worked on a prototype electronic computer that introduced key design ideas but was never completely realized as a general-purpose computing device. Some of Atanasoff's ideas may have been communicated to John Mauchly, who later assisted in the development of the ENIAC.

Colossus (1943)

Britain's Colossus project produced a series of about ten electronic computers used by British codebreakers to read encrypted German messages during World War II. The first Colossus prototype was completed by engineer Tommy Flowers in 1943 at the Post Office Research Station, Dollis Hill, with input from mathematician Max Newman and a few others. It used the binary numeral system for calculations, employing vacuum tubes and very fast optical punched-tape readers. By 1944, the project had moved to Bletchley Park, where it lasted until the end of the war. Shortly after, in 1946, Winston Churchill gave official orders to have the machines destroyed.

Harvard Mark I (1944)

For decades after World War II, it was widely believed that the IBM Automatic Sequence Controlled Calculator (ASCC), completed in 1944 and later called the Mark I, was the first electromechanical general-purpose computer[18]. The idea for the Harvard Mark I automatic digital calculator was conceived by Howard H. Aiken, then a graduate student at Harvard University who went on to earn a Ph.D. in theoretical physics. The machine was a hybrid of mechanical and electronic technology, performing calculations through a series of small gears, electromechanical counter wheels, and switches. Input occurred via punched cards, paper tape, or manually set switches that indicated the values to be processed. The output was generated by an electric typewriter or punched into additional cards. The successor to the Mark I, the Mark II, still used relays, but also featured an electrical memory and a system of 'constant' values that were referenced during run-time.[19]

ENIAC (1946)

At the University of Pennsylvania, John Mauchly and J. Presper Eckert proposed the Electronic Numerical Integrator And Computer to the U.S. Army Ordnance Department's Ballistics Research Laboratory in 1943, and then served as its main designers until construction was finished in 1946. It was a military project justified by a need to compute ballistic trajectories, and was one of the earliest general-purpose, programmable electronic computers[20]. ENIAC's computations used the decimal numeral system, instead of the binary numeral system used by most subsequent digital computers. The ENIAC was not able to store its own program in memory; it had to be programmed by setting switches on function tables and by changing the wiring, so considerable human effort was required to reprogram it.

UNIVAC and EDVAC (late 1940's)

The designers of ENIAC jointly formed the Eckert-Mauchly Computer Corporation in 1946, which was bought by Remington Rand in 1950. In 1951, this company delivered the first U.S. commercial computer, called the UNIVAC, to the United States Census Bureau. The UNIVAC was a stored-program computer, like its non-commercial sister the EDVAC. The Electronic Discrete Variable Automatic Computer was funded by the U.S. Army Ballistics Research Laboratory and was installed at the Aberdeen Proving Ground in Maryland. EDVAC was built by the University of Pennsylvania's ENIAC designers Eckert and Mauchly, together with John von Neumann and others. The EDVAC was among the first computers designed to implement the stored-program concept, an idea first published in von Neumann's 1945 report First Draft of a Report on the EDVAC[21]. Although its design predates the UNIVAC, the EDVAC did not become fully operational until 1952. Competing fiercely with IBM, the company eventually built 46 of these early commercial computer systems.

1950s: Hardware and the first compiler

Assemblers

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

First compiler

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

1960s: Batch operating systems

Batch operating systems

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Multics

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

1970s: Networks, better software, and smaller hardware

early networks

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

time-sharing operating systems

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

mini-computers

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

computer programming languages

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

1980s: Networks and personal computers, oh my!

personal computers

Apple Inc. dukes it out with the IBM-compatible PC and Microsoft.

The internet

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

  • DNS
  • electronic mail
  • newsgroups
  • FTP (early file sharing)

1990s: WWW, and several software revolutions

Dial-up internet access for the masses

Many small companies, and some large ones such as CompuServe, sold dial-up internet access accounts to the public. Suddenly, anyone could get online and have email, although email systems were not always compatible with each other. Combined with the availability of home (personal) computers, internet usage grew astronomically.

The web

Around 1991, Tim Berners-Lee published the HTTP protocol and the HTML language, and graphical web browsers, notably Mosaic (1993), soon became available. These led to what is now called the World Wide Web (or just WWW).

Object-oriented programming goes wild

Experimental computer programming languages based on object-oriented concepts had been developed decades earlier, but in the early 1990s, Sun Microsystems began developing its Java platform, which soon took the programming world by storm.

Open-source software shakes things up

After a decade of failed attempts to make the popular Unix operating system run on low-cost Wintel hardware, Linus Torvalds released the Linux kernel. Seeds planted in the 1980's by Richard Stallman's Free Software Foundation finally took root, and the open source software movement really took off, having a disruptive (and arguably positive) effect on the entire software economy.

XML, the universal translator

Another step on the road to operating system (platform) independence, this text-based standard for self-describing data had a huge impact. As XML parsers became available for major programming languages such as Java and C#, programs written in these languages could finally exchange structured information with each other, and much more.
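A minimal illustration of what "self-describing" means here, in Python: the element names travel with the data, so a parser in any language recovers the same structure (the record below is invented for illustration):

```python
import xml.etree.ElementTree as ET

# The tags name the data they carry; no out-of-band schema is needed
# for another program to locate the fields.
doc = """
<person>
  <name>Ada Lovelace</name>
  <born>1815</born>
</person>
"""

root = ET.fromstring(doc)
print(root.find("name").text)             # -> Ada Lovelace
print(int(root.find("born").text) + 37)   # -> 1852
```

A Java or C# program parsing the same document with its own XML library would see the identical element tree, which is the interoperability the section describes.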

MP3: Honey, I shrunk the music.

This compression algorithm led to digital music players and file sharing and much more. (need more here!)

after 2000

Wireless networking

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Google, Maps, Mashups, Amazon, Yahoo

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Global Positioning System (GPS)

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Social networking

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies. Myspace, Facebook, LinkedIn. Also, Digg, Del.icio.us, StumbleUpon, etc.

Telephony on the internet (VOIP)

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies. Skype, Vonage, etc.

Virtualization: one computer, many operating systems

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Famous people in history of computing

For now, see this list of people who made conceptual breakthroughs in computer science.

Famous concepts in history of computing

For now, see this list of seminal concepts in computer science.


References

  1. Colossus: The World’s First Electronic Computer. Pico Technology (no date). Retrieved on 2007-04-24.
  2. The ENIAC Museum Online. University of Pennsylvania School of Engineering and Applied Sciences (SEAS) (no date). Retrieved on 2007-04-23.
  3. The Invention of Paper Copyright © 2004 Wisconsin Paper Council (2004). Retrieved on 2007-04-24.
  4. The Printing Press by The History Guide copyright © 2000 Steven Kreis (2004). Retrieved on 2007-04-24.
  5. The Digital Revolution, the Informed Citizen, and the Culture of Democracy by Henry Jenkins and David Thorburn (from the introduction to Democracy and New Media, Cambridge: MIT Press, 2003). MIT Press (2003). Retrieved on 2007-04-24.
  6. The Abacus: A Brief History. Retrieved on 2007-04-24.
  7. Kaplan, Erez. 1996. The Controversial Replica of Leonardo da Vinci's Adding Machine. Retrieved on 2007-04-30.
  8. Abernethy, Ken and Allen, Tom. 2004. Early Calculating and Computing Machines: From the Abacus to Babbage. Furman University. Retrieved on 2007-04-30.
  9. A simplified example of the functionality of the Pascaline. La Machine de Pascal: la pascaline (French: The Machine of Pascal: The Pascaline). Retrieved on 2007-05-04.
  10. Lemelson-MIT Program, Inventor of the Week Archive. MIT. (February 2003). Retrieved on 2007-05-14.
  11. Dunne, Paul E.. History of Computation. Retrieved on 2007-05-14.
  12. O'Connor, J. J. and Robertson, E. F. (July 1999). Hollerith Biography. School of Mathematics and Statistics University of St. Andrews, Scotland. Retrieved on 2007-05-14.
  13. "A Symbolic Analysis of Relay and Switching Circuits", MIT master's thesis, published in Transactions of the American Institute of Electrical Engineers, Vol. 57 (1938), pp. 713-723. Retrieved on 2007-05-12.
  14. "Claude Shannon" from Professor Ray C. Dougherty's course notes (V61.0003) Communication: Men, Minds, and Machines (Fall, 1996). Microsoft Corporation (1996). Retrieved on 2007-05-12.
  15. (German) Zuse, Konrad: Verfahren zur selbsttätigen Durchführung von Rechnungen mit Hilfe von Rechenmaschinen. Patentanmeldung Z 23 139 / GMD Nr. 005/021 / Jahr 1936. Konrad Zuse: Bibliography.. Retrieved on 2007-05-16.
  16. Zuse, Horst. The Life and Work of Konrad Zuse. Wimborne Publishing LTD and Maxfield & Montrose Interactive Inc. Retrieved on 2007-05-16.
  17. (1987) "Portraits in Silicon" by Robert Slater, ch. 5, p. 43. The MIT Press. 
  18. IBM Archives: IBM's ASCC (a.k.a. the Harvard Mark I). IBM. Retrieved on 2007-05-15.
  19. Lemelson-MIT Program, Inventor of the Week Archive. MIT. (October 2002). Retrieved on 2007-05-15.
  20. "The ENIAC Museum Online", University of Pennsylvania School of Engineering and Applied Sciences. University of Pennsylvania. Retrieved on 2007-05-12.
  21. "First Draft of a Report on the EDVAC" (PDF format) by John von Neumann, Contract No.W-670-ORD-4926, between the United States Army Ordnance Department and the University of Pennsylvania. Moore School of Electrical Engineering, University of Pennsylvania, June 30, 1945. The report is also available in Stern, Nancy (1981). From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers. Digital Press.