{{subpages}}


Writing about the '''history of computing''' is challenging because of the complexity of any one [[computer]], the speed with which computer technology has evolved, and the many different types of computers that have been built.  Further, because computing reaches into so many different industries (such as telephony, automobiles, and cameras) and has spawned a huge industry for the making of computer hardware and software, it is difficult to know where such a history should stop.


This article is organized in sequential order, to present the major events in the development of general-purpose computing devices, including the basic evolution of their hardware components and software concepts. The article is organized by decade. Technologies appear, not in the decade in which they were first imagined or in which experimental versions became available, but in the decade in which they became widely accepted.

==Early devices (ancient times)==
Long before the arrival of ''mechanical'' computing, ancient civilizations devised various methods to calculate and keep track of numbers.


'''Salamis Tablet (300 B.C.)'''

A very early counting device, the Salamis Tablet, was used by the Babylonians to track numbers in their society.

'''Abacus (300 B.C.)'''

The abacus, a mechanical aid to performing arithmetic, dates back many centuries and is still in use in various forms.


==Mechanical computing (Renaissance to 1900)==
The human longing for mechanical help in performing complex computations existed long before technology was advanced enough to realize a practical solution.  These are some of the attempts to create computing machines before technology made them truly feasible.


'''Codex Madrid (Da Vinci)'''
On 13 February 1967, the "Codex Madrid", written by Leonardo Da Vinci, was discovered in the National Library of Spain by Dr. Roberto Guatelli.  Inside the Codex Madrid was a drawing for an elaborate mechanical computational device.  Guatelli noticed that a similar construct appeared in Da Vinci's "Codex Atlanticus".  A prototype of this machine was created in 1968 and was observed to exhibit the traits of a ratio machine: one revolution of the first shaft invoked ten revolutions of the second, and so on, until the last shaft, which rotated at a rate of ten to the thirteenth power.  Whether this was a true computational device has been the subject of some debate.  Previously displayed at IBM, the exhibit was removed due to a lack of consensus, and is presumed to be in one of IBM's storage facilities.
'''Pascaline (1642)'''


An early mechanical computational device is the [[Pascaline]], created by [[Blaise Pascal]] circa 1642.<ref>Abernethy, Ken and Allen, Tom.  2004. {{cite web|url=http://cs.furman.edu/digitaldomain/focus/history/earlyhist2.html|title=Early Calculating and Computing Machines: From the Abacus to Babbage|publisher=Furman University|accessdate=2007-04-30}}</ref>  The Pascaline performed simple addition and subtraction, carrying places by gear rotation.  Functionally, the machine worked by increasing values on a single cog, which ranged from 0 to 9.  Upon a full rotation, a series of cogs would advance the next gear by one step to read 1, while the first cog reset back to 0.<ref>A simplified example of the functionality of the Pascaline.  {{cite web|url=http://perso.orange.fr/therese.eveilleau/pages/truc_mat/textes/pascaline.htm|title=La Machine de Pascal: la pascaline (French: Pascal's Machine: The Pascaline)|accessdate=2007-05-04}}</ref>
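
The carry mechanism can be illustrated with a short simulation (a minimal sketch in Python of odometer-style carrying, not a description of Pascal's actual gearing):

<pre>
# Toy model of odometer-style carrying, as popularized by the Pascaline:
# each "wheel" holds a digit 0-9, and rolling past 9 advances the next wheel.

def add_on_wheels(wheels, amount, position=0):
    """Add `amount` to the wheel at `position`, propagating carries left."""
    while amount > 0 and position < len(wheels):
        total = wheels[position] + amount
        wheels[position] = total % 10      # this wheel's new digit
        amount = total // 10               # carry passed to the next wheel
        position += 1
    return wheels

# Wheels are stored least-significant digit first: 199 -> [9, 9, 1]
wheels = [9, 9, 1]
add_on_wheels(wheels, 1)   # 199 + 1 = 200
print(wheels)              # [0, 0, 2]
</pre>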


'''Charles Babbage (early 1800's)'''


It would take [[Charles Babbage]], born on December 26, 1791 and later a Fellow of the Royal Society, to develop the first truly successful automatic calculating machine<ref>{{cite web|url=http://web.mit.edu/invent/iow/babbage.html|title=Lemelson-MIT Program, Inventor of the Week Archive|date=February 2003|accessdate=2007-05-14|publisher=MIT.}}</ref>.  In 1821, Babbage developed the Difference Engine No. 1, a machine designed to compile mathematical tables based on polynomial calculation.<ref>{{cite web|url=http://www.csc.liv.ac.uk/~ped/teachadmin/histsci/htmlform/lect4.html|title=History of Computation|author=Dunne, Paul E.|accessdate=2007-05-14}}</ref>  The engine's physical algorithm was based on a mathematical technique known as the Method of Differences, to which Babbage contributed.  Unfortunately, only a fragment of the machine would ever come to fruition, due to various financial disputes and accusations of fund mismanagement involving the British Government.  More importantly, the machine was never fully developed because Babbage had conceived of a more advanced machine, the Analytical Engine.  Like the Pascaline, both the Difference and Analytical Engines relied on series of cogs and gears to compute values.
 
Functionally, the Analytical Engine was capable of various algorithmic operations that were broken down into basic algebraic operations.  Two sets of cards would be used to program the system: the first would detail what operations were to be performed, and the second would contain the values to be operated on.  In this sense, the Analytical Engine was much like a computer, having an input (the algorithm as described on cards), a processor (the machine), an output (the result), and memory (the cards themselves serving as a storage medium).  Babbage's associate [[Ada Lovelace]] was arguably the first computer programmer, designing algorithms for the Analytical Engine.
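
The Method of Differences itself is easy to demonstrate (a minimal sketch in Python, not a model of Babbage's mechanism): once the starting value and differences of a polynomial are known, every further table entry can be produced by addition alone, which is exactly the operation the engine's gears performed.

<pre>
# Method of Differences: tabulate p(x) = x^2 + x + 41 using only addition.
# For a degree-2 polynomial the second differences are constant, so after
# priming three starting values the whole table is built by repeated adding.

def difference_table(p, start, count):
    v = p(start)
    d1 = p(start + 1) - p(start)                      # first difference
    d2 = p(start + 2) - 2 * p(start + 1) + p(start)   # second difference (constant)
    values = []
    for _ in range(count):
        values.append(v)
        v += d1        # next value, by addition only
        d1 += d2       # next first difference, by addition only
    return values

print(difference_table(lambda x: x * x + x + 41, 0, 5))
# [41, 43, 47, 53, 61] -- matches direct evaluation of the polynomial
</pre>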


'''Hollerith and punched cards (1884)'''


[[Herman Hollerith]] was born on February 29, 1860 in New York.  In 1875 Hollerith entered the City College of New York, and he graduated from the Columbia School of Mines in 1879 with an engineering degree.<ref>{{cite web|url=http://www-groups.dcs.st-and.ac.uk/~history/Biographies/Hollerith.html|title=Hollerith Biography|author=O'Connor, J. J. and Robertson, E. F.|date=July 1999|publisher=School of Mathematics and Statistics University of St. Andrews, Scotland|accessdate=2007-05-14}}</ref>  After graduating, Hollerith took up work with the United States Census Bureau, where he was appointed Chief Special Agent.  Hollerith's contribution to computing was inspired by his work at the Census Bureau, especially by Dr. John Shaw Billings, who suggested that there should be a way to process the large amount of census data by some mechanical means.  In 1884, Hollerith worked to develop a way to tabulate census information through the use of punched cards, eventually recognizing that the cards could serve as a storage medium for census data.  His experiments led to a process whereby a pin would pass through a hole in the card to complete an electrical circuit.  His system, by which cards could be read and tabulated on a mechanical counter through circuit completion, was called the Hollerith Electric Tabulating System.  By 1890, the machines had been improved so that a simple keyboard could be used to tabulate data instead of entry by hand.
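
The principle of the tabulator, reduced to its essentials, is a set of counters that advance whenever a pin passes through a punched hole and closes a circuit.  The sketch below illustrates the idea in Python (the two-field card layout is invented for illustration, not Hollerith's actual card format):

<pre>
# Toy model of Hollerith-style tabulation: a punched hole closes a circuit,
# and each closed circuit advances the corresponding mechanical counter.
# The two-field card layout here is purely illustrative.

cards = [
    {"male": True,  "foreign_born": False},
    {"male": False, "foreign_born": True},
    {"male": True,  "foreign_born": True},
]

counters = {"male": 0, "foreign_born": 0}
for card in cards:
    for field, punched in card.items():
        if punched:                 # a pin passes through the hole...
            counters[field] += 1    # ...and the counter ticks forward
print(counters)   # {'male': 2, 'foreign_born': 2}
</pre>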


==Prerequisites to the first electronic computers (early 1900's)==


'''Switches'''


The discovery of electricity, along with the invention of [[Electronic switch|electronic switches]] in the early 1900's, including the solenoid and the switching vacuum tube, was a necessary prerequisite to the invention of electronic computers.
 
'''Computational theory'''
 
Much work on the theory of computation was done in the 1930s.  [[Alan Turing]] and [[Alonzo Church]] independently invented new formal models of computation, the [[Turing Machine]] and the [[lambda calculus]] respectively, and showed that some problems, notably the [[halting problem]], can never be solved by any algorithm.
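
The flavor of the lambda calculus can be conveyed in a modern language (a minimal sketch using Python lambdas; Church's own notation was different): numbers and arithmetic are encoded purely as functions.

<pre>
# Church numerals: encoding numbers as pure functions, in the spirit of
# Alonzo Church's lambda calculus. The numeral n applies a function n times.

zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))                # successor
plus = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))   # addition

def to_int(n):
    """Convert a Church numeral to a Python int by counting applications."""
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(plus(two)(two)))   # 4
</pre>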


'''Switching algebra'''


[[Boolean algebra]], invented by [[George Boole]] in the 1850's, is an algebraic system consisting of only two values, and it was destined to become the basis for describing the digital logic circuits used to build electronic computers.  The realization that Boolean algebra could be used to describe logic circuits was a major conceptual breakthrough first documented by [[Claude Shannon]] in his 1937 MIT master's thesis<ref name="Shannon3">{{cite web|url=http://www.research.att.com/~njas/doc/shannonbib.html|title=``A Symbolic Analysis of Relay and Switching Circuits'', MIT master's thesis published in T.A.I.E.E. Vol. 57 (1938), pp. 713-723|publisher= Transactions American Institute of Electrical Engineers|year=1938|accessdate=2007-05-12}}</ref>.  Shannon's thesis created a stir in the world of electronics when it began circulating in 1938<ref name="Shannon1">{{cite web|url=http://www.nyu.edu/pages/linguistics/courses/v610003/shan.html|title="Claude Shannon" from Professor Ray C. Dougherty's course notes (V61.0003) Communication: Men, Minds, and Machines (Fall, 1996)|publisher=[[Microsoft Corporation]]|year=1996|accessdate=2007-05-12}}</ref>, though Shannon became better known in later years for founding the field of [[information theory]].  Among computer designers, Boolean algebra subsequently came to be known as "switching algebra".
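
Shannon's correspondence is simple to state: switches wired in series compute AND, switches wired in parallel compute OR, and a normally-closed contact computes NOT.  The sketch below (in Python, with an arbitrary example circuit) enumerates a circuit's behavior the way a truth table would:

<pre>
# Shannon's correspondence between relay circuits and Boolean algebra:
# switches in series compute AND, switches in parallel compute OR,
# and a normally-closed contact computes NOT.

def series(a, b):        # both switches must be closed for current to flow
    return a and b

def parallel(a, b):      # current flows if either switch is closed
    return a or b

def normally_closed(a):  # contact opens when the relay is energized
    return not a

# Example circuit: (A in series with B) in parallel with (not C)
for A in (False, True):
    for B in (False, True):
        for C in (False, True):
            lamp = parallel(series(A, B), normally_closed(C))
            print(A, B, C, "->", lamp)
</pre>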


==1940s: The first electronic computers==


Prior to World War II, the word ''computer'' generally meant a person who computes.  But in the early 1940's, during World War II, the first electronic '''[[Computer|computers]]''' were developed to perform numerical calculations far faster than humans could.  With the notable exception of the Zuse Z3, which was developed in Germany in relative isolation, the first electronic computers were mainly the result of secret military projects funded by the British and U.S. governments<ref name="Colossus">{{cite web|url=http://www.picotech.com/applications/colossus.html|title=Colossus: The World’s First Electronic Computer|publisher=Pico Technology|year=date_not_specified|accessdate=2007-04-24}}</ref><ref name="Eniac">{{cite web|url=http://www.seas.upenn.edu/~museum/|title=The ENIAC Museum Online|publisher=University of Pennsylvania School of Engineering and Applied Sciences (SEAS)|year=date_unspecified|accessdate=2007-04-23}}</ref>.


'''Zuse Z3 (1941)'''


German engineer [[Konrad Zuse]] built the Z1 computer between 1936 and 1938.  German patent applications provide evidence of Zuse's development of a mechanical memory device in 1936, used in the Z1.<ref>(German) Zuse, Konrad: Verfahren zur selbsttätigen Durchführung von Rechnungen mit Hilfe von Rechenmaschinen. Patentanmeldung Z 23 139 / GMD Nr. 005/021 / Jahr 1936. {{cite web|url=http://www.epemag.com/zuse/Bgraphy.htm#ZUSE36|title=Konrad Zuse: Bibliography.|accessdate=2007-05-16}}</ref>  Zuse built the Z2 sometime between 1936 and 1939, and the Z3 from 1939 to 1941<ref>{{cite web|url=http://www.epemag.com/zuse/|title=The Life and Work of Konrad Zuse|accessdate=2007-05-16|publisher=Wimborne Publishing LTD and Maxfield & Montrose Interactive Inc|author=Zuse, Horst}}</ref>.  The German government was not supportive of Zuse's work, and evidence of the Z3's existence was discovered by Allied forces only after the end of World War II.  All photographs of the Zuse Z3 were destroyed by Allied raids during the war<ref name="Zuse1">{{cite book|url=http://www.amazon.com/Portraits-Silicon-Robert-Slater/dp/0262691310|title="Portraits in Silicon" by Robert Slater, ch. 5, p. 43|publisher=The MIT Press|year=1987}}</ref>.  Zuse's constructions incorporated advanced concepts, including the implementation of the [[binary numeral system]].  Having survived the war, Zuse built another computer in Switzerland, and later was the first designer to propose [[pipelining]] the computations of a computer [[CPU|processor]].  In 1949, Zuse formed Zuse KG, where he worked until 1966.  Zuse's Z3 is now recognized as probably having been the first general-purpose electro-mechanical computer.


'''Atanasoff pre-computer (1942)'''
Between 1937 and 1942, Dr. John V. Atanasoff and graduate student Clifford Berry, of Iowa State University, worked on a prototype electronic computer that introduced key design ideas but was never completely realized as a general-purpose computing device.  Some of Atanasoff's ideas may have been communicated to John Mauchly, who later assisted in the development of the ENIAC.
'''Colossus (1943)'''


Britain's [[Colossus project]] produced a series of about ten electronic computers used by British code breakers to read encrypted German messages during World War II.  The first Colossus prototype was initially completed by engineer [[Tommy Flowers]] in 1943 at the Post Office Research Station, [[Dollis Hill]], with input from mathematician Max Newman and a few others.  It used the [[binary numeral system]] for calculations, utilizing vacuum tubes and very fast optical punch tape readers.  By 1944, the project moved to [[Bletchley Park]] and lasted until the end of the war.  Shortly after, in 1946, Winston Churchill gave official orders to have the machines destroyed.


'''Harvard Mark I (1944)'''


For decades after World War II, it was widely believed that the IBM Automated Sequence Controlled Calculator (ASCC), completed in 1944 and later called the Mark I, was the first electromechanical general-purpose computer<ref>{{cite web|url=http://www-03.ibm.com/ibm/history/exhibits/markI/markI_intro.html|title=IBM Archives: IBM's ASCC (a.k.a. The Harvard Mark I)|accessdate=2007-05-15|publisher=IBM}}</ref>.  The idea for the Harvard Mark I automatic digital calculator was conceived by [[Howard H. Aiken]], then a Harvard University graduate student pursuing a Ph.D. in theoretical physics.  The machine was a hybrid of mechanical and electronic technology, performing calculations through a series of small gears, electro-mechanical counter wheels, and switches.  Input occurred via punched cards, paper tape, or manually set switches indicating the values to be processed.  The output was generated by an electric typewriter or punched into additional cards.  The successor to the Mark I, the Mark II, still used relays, but also featured an electrical memory and a system of 'constant' values that were referenced at run time.<ref>{{cite web|url=http://web.mit.edu/invent/iow/aiken.html|date=October 2002|accessdate=2007-05-15|publisher=MIT.|title=Lemelson-MIT Program, Inventor of the Week Archive}}</ref>


'''ENIAC (1946)'''
At the University of Pennsylvania, John Mauchly and J. Presper Eckert proposed the Electrical Numerical Integrator And Computer to the U.S. Army Ordnance Department's Ballistics Research Laboratory in 1943, and then served as its main designers until construction was finished in 1946.  It was a military project justified by a need to compute ballistic trajectories, and was one of the earliest general-purpose, programmable electronic computers.  ENIAC's computations used the decimal numeral system, instead of the binary numeral system used by most subsequent digital computers.  The ENIAC was not able to store its own program in memory; it had to be programmed by setting switches on function tables and by changing the wiring, and considerable human effort was required to reprogram it.
'''UNIVAC and EDVAC (late 1940's)'''


The designers of ENIAC jointly formed the Eckert-Mauchly Computer Corporation in 1946, which was bought by Remington Rand in 1950.  In 1951, this company delivered the first U.S. commercial computer, called the [[UNIVAC]], to the United States Census Bureau.  The UNIVAC was a [[stored-program]] computer, like its non-commercial sister the [[EDVAC]].  The Electronic Discrete Variable Automatic Computer was funded by the U.S. Army Ballistics Research Laboratory and was built at the Aberdeen Proving Ground in Maryland.  EDVAC was built by the University of Pennsylvania's ENIAC designers Eckert and Mauchly, together with John von Neumann and others.  The EDVAC was the first computer to implement the stored-program concept, an idea first published in von Neumann's 1945 report [http://www.virtualtravelog.net/entries/2003-08-TheFirstDraft.pdf First Draft of a Report on the EDVAC]<ref> [http://www.virtualtravelog.net/entries/2003-08-TheFirstDraft.pdf "First Draft of a Report on the EDVAC"] ([[PDF]] format) by John von Neumann, Contract No.W-670-ORD-4926, between the United States Army Ordnance Department and the [[University of Pennsylvania]]. [[Moore School of Electrical Engineering]], University of Pennsylvania, [[June 30]], 1945.  The report is also available in {{cite book| first=Nancy| last=Stern| title=From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers| publisher=Digital Press| year=1981}}</ref>.  Although its design predates the UNIVAC, the EDVAC did not become fully operational until 1952.  Competing fiercely with IBM, Remington Rand eventually built 46 of the earliest commercial computer systems.
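
The stored-program concept can be illustrated with a toy machine (a minimal sketch in Python; the three-instruction set is invented, not EDVAC's): instructions live in the same memory as the data they manipulate, and a fetch-decode-execute loop interprets them.

<pre>
# Toy stored-program machine in the spirit of the EDVAC design: instructions
# and data share one memory, and a fetch-decode-execute loop interprets them.
# The tiny instruction set here is invented for illustration.

memory = [
    ("LOAD", 6),     # 0: acc = memory[6]
    ("ADD",  7),     # 1: acc += memory[7]
    ("STORE", 8),    # 2: memory[8] = acc
    ("HALT", 0),     # 3: stop
    None, None,      # 4-5: unused
    20,              # 6: data
    22,              # 7: data
    0,               # 8: result goes here
]

acc, pc = 0, 0
while True:
    op, addr = memory[pc]        # fetch and decode from the same memory
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break
print(memory[8])   # 42
</pre>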
 
==1950s: Hardware and the first compiler==

In the decades after World War II, computers grew rapidly in usefulness while decreasing in size and cost, spawning a huge and complex industry to create hardware and software for them.  But in the 1950's, they were still huge and expensive, and available only to a few people.


'''Assemblers'''
Need a paragraph here with pointers to existing or future articles about specific projects, products, companies, or technologies.

'''First compiler'''

Need a paragraph here with pointers to existing or future articles about specific projects, products, companies, or technologies.


==1960s: Batch operating systems==
The 1960s brought the first era of widespread commercial corporate use of computers.  Groups of people used [[punched cards]] or [[paper tape]] to write programs, and a [[computer operator]] would take the piles of stacked cards that built up and submit them as individual jobs to a large computer.


'''Batch operating systems'''
Need a paragraph here with pointers to existing or future articles about specific projects, products, companies, or technologies.
'''Multics'''


[http://multicians.org Multicians.org] has a huge amount of information on [[Multics]] waiting to be mined.


==1970s: Networks, better software, and smaller hardware==


===DARPA and the early networks===


Critical concepts included [[packet switching]], from [[Paul Baran]] and [[Leonard Kleinrock]], and interconnected heterogeneous networks, first called '''catenets''', from [[Louis Pouzin]] and [[Vinton Cerf]], with [[J.R. Licklider]] providing project direction and government support.  These concepts led, by the 1980's, to the [[Internet]].
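
The core idea of packet switching can be shown in a few lines (a minimal sketch in Python with invented field names, not any real protocol): a message is cut into independently routed packets that may arrive out of order and are reassembled at the destination.

<pre>
# Toy illustration of packet switching: split a message into numbered
# packets, let the network deliver them in arbitrary order, and have the
# receiver reassemble by sequence number. Field names are invented.

import random

def packetize(message, size):
    return [{"seq": i, "payload": message[i:i + size]}
            for i in range(0, len(message), size)]

def reassemble(packets):
    ordered = sorted(packets, key=lambda p: p["seq"])
    return "".join(p["payload"] for p in ordered)

packets = packetize("the network is the computer", 5)
random.shuffle(packets)        # packets may take different routes
print(reassemble(packets))     # original message restored
</pre>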


===Time-sharing operating systems: the birth of UNIX and C===


Need a paragraph here with pointers to existing or future articles about specific projects, products, companies, or technologies.


===Mini-computers: hardware gets smaller===


Need a paragraph here with pointers to existing or future articles about specific projects, products, companies, or technologies.


=== Computer programming languages ===
During the 1970s, a variety of new programming languages emerged, the [[C programming language]] being the one still most widely used in systems programming.  In addition, a variety of higher-level languages appeared: Charles Moore's [[Forth programming language|Forth]] and the logic programming language [[Prolog programming language|Prolog]].  The competitive process to develop [[Ada programming language|Ada]], which would later become the language used for military and aviation programming, also began.  The Lisp dialect [[Scheme programming language|Scheme]] was created as well; it would later become the language used in [[Structure and Interpretation of Computer Programs]], the introductory programming course for [[Massachusetts Institute of Technology|MIT]] undergraduates.


Many programming languages also became more standardized, with the [[American National Standards Institute]] (ANSI) publishing specifications for COBOL, FORTRAN 77 and MUMPS.


==1980s: Networks and personal computers, oh my!==


===Personal computers===


The "big three" (Apple, Commodore, and Tandy-Radio Shack) duked it out with the [[IBM compatible PC]] and [[Microsoft MS-DOS]].
 
===The internet===
 
Need a paragraph here with pointers to existing or future articles about specific projects, products, companies, or technologies.  ''(Note: HTTP / the Web is specifically excluded from this decade, because the WWW came later)''
 
===DNS===


====electronic mail====


====newsgroups====
===TCP applications (FTP, gopher, telnet, WAIS, Archie / Veronica, etc.)===


==1990s: WWW, and several software revolutions==


===Dial-up internet access for the masses===
 
Many small companies, and some large ones such as [[CompuServe]], sold dial-up internet access accounts to the public.  Suddenly, anyone could get online and have email, although email systems were not always compatible with each other.  Combined with the availability of home (personal) computers, internet usage grew astronomically.
 
===The World Wide Web===
 
In the early 1990s, Tim Berners-Lee published the [[HTTP]] protocol and the [[HTML]] language, and in 1993 the first widely adopted graphical [[web browser]] (Mosaic) became available.  These led to what is now called the [[World Wide Web]] (or just WWW).


===Object-oriented programming goes wild===


Experimental computer programming languages based on object-oriented concepts were developed decades earlier, but in 1992, [[James Gosling]] and his team at [[Sun Microsystems]] began developing the [[Java platform]], which soon took the programming world by storm.  Java provided an alternative to C++ for many, and is now widely used for programming on both the server and the client, as well as being taught as the first language for many students on computer science programs.  Java is now a mainstay of enterprise computing.  Java, like many languages before and since, uses a [[virtual machine]] architecture: the programmer writes in the Java programming language, which compiles to bytecode, which in turn runs on a virtual machine that abstracts away the underlying platform.  The idea is that this supposedly makes it simpler to write cross-platform applications, although in practice this is rarely as easy as the early Java hype made it sound.  Java's popularity spread widely, with Java variants appearing on mobile phones ([[JavaME]]), the enterprise server ([[JavaEE]]) and on web pages (the brief phenomenon of Java applets), to the point where some people started criticizing Java's dominance<ref>Paul Graham, [http://www.paulgraham.com/javacover.html Java's Cover]</ref><ref>Paul Graham, [http://www.paulgraham.com/gh.html Great Hackers]</ref><ref>Paul Graham, [http://www.paulgraham.com/pypar.html The Python Paradox]</ref>.
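
The virtual-machine idea can be illustrated generically (a minimal sketch in Python with an invented instruction set, not real JVM bytecode): a program is compiled once to portable instructions, and any host that has an interpreter for those instructions can run it unchanged.

<pre>
# Toy illustration of the virtual-machine idea behind Java: a program is
# compiled once to portable "bytecode", and any host with an interpreter
# for that bytecode can run it. The stack-machine instruction set is invented.

bytecode = [
    ("PUSH", 2),
    ("PUSH", 3),
    ("MUL", None),
    ("PUSH", 4),
    ("ADD", None),
    ("PRINT", None),
]

def run(code):
    stack = []
    for op, arg in code:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "PRINT":
            print(stack.pop())

run(bytecode)   # prints 10; the same bytecode runs wherever `run` exists
</pre>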


A wide variety of new programming languages also appeared during the 1990s, often aimed at absolute beginners.  Microsoft created [[Visual Basic]], which worked both on its own and as part of the Office applications.  'Scripting' languages also became extremely popular: [[AppleScript]] was introduced on the Macintosh platform, with a "natural language" syntax designed for non-programmers (and which many programmers find cumbersome<ref>Vincent Gable, [http://imlocation.wordpress.com/2007/09/24/nsapplescript-sucks/ (NS)AppleScript Sucks], IMLocation Development Blog</ref>).  [[Python programming language|Python]] was a significant development, providing a simple scripting language that anyone could start using, but which had advanced object-oriented features.  [[Ruby programming language|Ruby]] became popular later, offering some of the same benefits.


=== Open-source software shakes things up ===


After a decade of failed attempts to make the popular [[Unix operating system]] run on low-cost Wintel hardware, Linus Torvalds released the [[Linux operating system]].  Seeds planted in the 1980's by Richard Stallman's [[Free Software Foundation]] finally took root, and the [[open source software]] movement really took off, having a disruptive (and arguably positive) effect on the entire software economy.


=== XML, the universal translator ===


Another step on the road to operating system (platform) independence, this text-based standard for self-describing data had a huge impact.  As major programming languages such as [[Java]] and [[C#]] developed XML parsers, these languages could finally exchange information with each other, and much more.
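
Using Python's standard library as one example of the XML parsers that appeared in every major language, a round trip from text to data structure and back looks like this (a minimal sketch; the document structure is invented for illustration):

<pre>
# Round-tripping self-describing data with XML, using Python's standard
# library as one example consumer. The <library>/<book> document structure
# is invented for illustration.

import xml.etree.ElementTree as ET

xml_text = """
<library>
  <book year="1981">
    <title>From ENIAC to UNIVAC</title>
    <author>Nancy Stern</author>
  </book>
</library>
"""

root = ET.fromstring(xml_text)            # parse: text -> element tree
for book in root.findall("book"):
    print(book.get("year"), book.findtext("title"))

book = ET.SubElement(root, "book", year="1987")    # modify the tree
ET.SubElement(book, "title").text = "Portraits in Silicon"
print(ET.tostring(root, encoding="unicode"))       # serialize back to text
</pre>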


=== MP3: Honey, I shrunk the music ===


This compression algorithm led to digital music players and file sharing and much more.  (need more here!)


==After 2000==
{{seealso|Convergence of communications}}
'''Wireless networking'''
Need a paragraph here with pointers to existing or future articles about specific projects, products, companies, or technologies.
{{r|Locality of networks|Wireless local area network}}
{{r|Cellular telephony}}
{{r|Satellite communications}}
'''Google, Maps, Mashups, Amazon, Yahoo'''
Need a paragraph here with pointers to existing or future articles about specific projects, products, companies, or technologies.
{{r|Mashup}}
'''Global Positioning System (GPS)'''
Need a paragraph here with pointers to existing or future articles about specific projects, products, companies, or technologies.
{{r|Global Navigation Satellite System}}


'''Social networking'''


Need a paragraph here with pointers to existing or future articles about specific projects, products, companies, or technologies.  MySpace, Facebook, LinkedIn.  Also, Digg, Del.icio.us, StumbleUpon, etc.


'''Telephony on the internet (VOIP)'''


{{r|Voice over Internet Protocol}}


'''Virtualization: one computer, many operating systems'''
 
Need a paragraph here with pointers to existing or future articles about specific projects, products, companies, or technologies.


==Famous people in history of computing==
For now, see this [[Computer_science/Catalogs/Breakthroughs|list of people who made conceptual breakthroughs in computer science]].


==Famous concepts in history of computing==
For now, see this [[Computer_science/Catalogs/List_of_seminal_concepts_in_computer_science|list of seminal concepts in computer science]].
 
 


==References==
{{reflist|2}}[[Category:Suggestion Bot Tag]]

Latest revision as of 11:00, 28 August 2024

This article is developing and not approved.
Main Article
Discussion
Related Articles  [?]
Bibliography  [?]
External Links  [?]
Citable Version  [?]
 
This editable Main Article is under development and subject to a disclaimer.

Writing about the history of computing is challenging because of the complexity of any one computer, the speed with which computer technology has evolved, and the many different types of computers that have been built. Further, since computing reaches into so many different industries (such as telephony, automotives, or cameras) and has spawned a huge industry for the making of hardware and software for computers, it is difficult to know where such a history should stop.

This article is organized in sequential order, to present the major events in the development of general-purpose computing devices, including the basic evolution of their hardware components and software concepts. The article is organized by decade. Technologies appear, not in the decade in which they were first imagined or in which experimental versions became available, but in the decade in which they became widely accepted.

Early devices (ancient times)

Long before the arrival of mechanical computing, ancient civilizations devised various methods to calculate and keep track of numbers.

Salamis Tablet (300 B.C.)

A very early counting device, the Salamis Tablet[1], was used by the Babylonians to track numbers in their society.

Abacus (300 B.C.)

The abacus, a mechanical aid to performing arithmetic, dates back many centuries and is still in use in various forms.

Mechanical computing (Renaissance to 1900)

The human longing for mechanical help in performing complex computations existed for a long time before technology was advanced enough to realize a practical solution. These are some of the attempts to create computing machines before technology was sufficiently advance to make that very feasible.

Codex Madrid (Da Vinci)

On 13 February 1967, the "Codex Madrid", written by Leonardo Da Vinci, was discovered in the National Library of Spain by Dr. Roberto Guatelli[2]. Inside the Codex Madrid was a drawing for an elaborate mechanical computational device. Guatelli noticed that a similar construct appeared in Da Vinci's "Codex Atlanticus". A prototype of this machine was created in 1968, and was observed that it exhibited traits that of a ratio machine. One revolution of the first shaft(10^1) invoked ten revolutions of the second(10^2), repeating until the last shaft which rotated at a rate of ten to the power of 13. Whether this was a true computational device was under some debate. Previously been displayed at IBM, the exhibit was removed due to a nonconsensus, and is presumed to be in one of IBM's storage facilities.

Pascaline (1642)

An early mechanical computational device is the Pascaline, created by Blaise Pascal circa 1642.[3] The Pascaline performed simple addition and subtraction. The concept of the Pascaline came about from the carrying of places by gear rotation. Functionally, the machine worked by increasing values on a single cog, which ranged from values 0 to 9. Upon the next rotation, a series of cogs would rotate the next gear over one iteration to read 1 while the first cog would reset back to 0.[4]

Charles Babbage (early 1800's)

It would take Charles Babbage, born on December 26, 1791 and inducted as a Fellow of the Royal Society to develop the first real successful automatic calculating machine[5]. In 1821, Babbage developed the Difference Engine No. 1, which was a functional machine designed to compile mathematical tables based on polynomial calculation.[6]. The difference engine's physical algorithm was based on a mathematical technique known as the Method of Differences, which Babbage contributed work on. Unfortunately only a fragment of the machine would ever come to fruition due to various financial disputes and accusations of fund mismanagement from the British Government. More importantly, the machine was never fully developed due to Babbage's realization of a more advanced machine called the Analytical Engine. Like the Pascaline, both the Difference and Analytical Engines relied on series of cogs and gears to compute values.

Functionally, the Analytical Engine was capable of various algorithmic operations that were broken down into basic algebraic operations. Two cards would be used to program the system: the first would detail what operations were required to be performed, and the second would contain the values to be operated on. In this sense, the Analytical Machine was much like a computer, having an input (the algorithm as described on a card), a processor (the machine), an output (the result), and memory (using a storage method--the cards themselves). Babbage's associate Ada Lovelace was arguably the first computer programmer, designing algorithms for the Analytical Engine.

Hollerith and punched cards (1884)

Herman Hollerith was born on February 29, 1860 in New York. In 1875 Hollerith attended the City College of New York, he graduated from the Columbia School of Mines in 1879 with an engineering degree.[7] After graduating, Hollerith took up work with the United States Census Bureau, and was appointed Chief Special Agent. Hollerith's contribution to computing was inspired by his work at the USCB, especially from Dr. John Shaw Billings who suggested that there should be a way to process the large amount of census data by some mechanical means. In 1884, Hollerith worked to develop a way to tabulate census information through the use of punch cards. Eventually, he recognized that cards could be used as storage medium for census data. His experiments lead to a process by where a pin would go through a hole in the card to complete an electrical circuit. His system by which cards could be read and tabulated on a mechanical counter through a circuit completion was called the Hollerith Electric Tabulating System. By 1890, the machines were improved so that a simple keyboard could be used to tabulate data instead of entry by hand.

Prerequisites to the first electronic computers (early 1900's)

Switches

The discovery of electricity, along with the invention of electronic switches in the early 1900's, including the solenoid and the switching vacuum tube, were necessary pre-requisites to the invention of electronic computers.

Computational theory

Much work on the theory of computation was done in the 1930s. Alan Turing and Alonzo Church both came up with new techniques to solve the halting problem, inventing the Turing Machine and lambda calculus respectively.

Switching algebra

Boolean algebra, invented by George Boole in the 1850's, is an algebraic system consisting of only two values, and it was destined to become the basis for describing digital logic circuits used to build electronic computers. The realization that Boolean algebra could be used to describe logic circuits was a major conceptual breakthrough first documented by Claude Shannon in his 1937 MIT master's thesis[8]. Shannon's thesis created a stir in the world of electronics when it began circulating in 1938[9], though Shannon was better known in later years for founding the field of information theory. Boolean algebra subsequently became known as "switching algebra" by computer designers.

1940s: The first electronic computers

Prior to World War II, the word computer generally meant a person who computes. But in the early 1940's (during World War II), the first electronic computers were developed to perform numerical calculations far faster than humans could. With the notable exception of the Zuse Z3, which was developed in Germany in relative isolation, the first electronic computers were mainly the result of secret military projects funded by the British and U. S. governments[10][11].

Zuse Z3 (1941)

German engineer Konrad Zuse built the Z1 computer between 1936 and 1938. German patent applications provide evidence of Zuse's development of a mechanical memory device in 1936, used in the Z1.[12]. Zuse built the Z2 sometime between 1936 and 1939, and the Z3 from 1939 to 1941[13]. The German government was not supportive of Zuse's work, and evidence of the Z3's existence was discovered by Allied forces only after the end of World War II. All photographs of the Zuse Z3 were destroyed by allied raids during the war[14]. Zuse's constructions incorporated advanced concepts, including the implementation of the binary numeral system. Having survived the war, Zuse built another computer in Switzerland, and later was the first designer to propose pipelining the computations of a computer processor. In 1949, Zuse formed Zuse KG, where he worked until 1966. Zuse's Z3 is now recognized as probably having been the first general-purpose electro-mechanical computer.

Atanasoff pre-computer (1942)

Between 1937 and 1942, Dr. John V. Atanasoff and graduate student Clifford Berry, of Iowa State University, worked on a prototype electronic computer that introduced key design ideas but which never completely realized as a general purpose computing device. Some of Atanasoff's ideas may have been communicated to John Mauchly, who later assisted in the development of the ENIAC.

Colossus (1943)

Britain's Colossus project produced a series of about ten electronic computers used by British code breakers to read encrypted German messages during World War II. The first Colossus prototype was initially completed by engineer Tommy Flowers in 1943 at the Post Office Research Station, Dollis Hill, with input from mathematician Max Newman and a few others. It used the binary numeral system for calculations, utilizing vacuum tubes and very fast optical punch tape readers. By 1944, the project moved to Bletchley Park and lasted until the end of the war. Shortly after, in 1946, Winston Churchill gave official orders to have the machines destroyed

Harvard Mark I (1944)

For decades after World War II, it was widely believed that the IBM Automated Sequence Controlled Calculator(ASCC) (completed in 1944 and later called the Mark I) was the first electromechanical general-purpose computer[15]. The idea for the Harvard Mark I automatic digital calculator was conceived by Howard H. Aiken, then a graduate student from Harvard University with a Ph. D. in theoretical physics. The machine was a hybrid of mechanical and electronic technology, performing calculations through a series of small gears, electro-mechanical counter wheels, and switches. Input occurred via punched cards, paper tape or through manually set switches to indicate the values to be processed. The output was generated by an electric typewriter or punched into additional cards. The successor to the Mark I, the Mark II, would still used relays, but also featured an electrical memory and a system of 'constant' values that were referenced during run-time.[16]

ENIAC (1946)

At the University of Pennsylvania, John Mauchly and J. Presper Eckert proposed the Electrical Numerical Integrator And Computer to the U.S. Army Ordnance Department's Ballistics Research Laboratory in 1943, and then served as its main designers until construction was finished in 1946. It was a military project justified by a need to compute ballistic trajectories, and was one of the earliest general-purpose, programmable electronic computers[17]. ENIAC's computations used the decimal numeral system, instead of the binary numeral system used by most subsequent digital computers. The ENIAC was not able to store its own program in memory; it had to be programmed by setting switches on function tables and by changing the wiring. Considerable human effort was required to reprogram it.

UNIVAC and EDVAC (late 1940's)

The designers of ENIAC jointly formed the Eckert-Mauchly Computer Corporation in 1946, which was bought by Remington Rand in 1950. In 1951, this company delivered the first U. S. commercial computer, called the UNIVAC, to the United States Census Bureau. The UNIVAC was a stored-program computer, like its non-commercial sister the EDVAC. The Electronic Discrete Variable Automatic Computer was funded by the U.S. Army Ballistics Research Laboratory and was built at the Aberdeen Proving Ground in Maryland. EDVAC was build by the University of Pennsylvania's ENIAC designers Eckert and Mauchly, and by John von Neumann and some others. The EDVAC was the first computer to implement the stored-program concept. The idea was first published in von Neumann's 1945 report First Draft of a Report on the EDVAC[18]. Although its design predates the UNIVAC, the EDVAC did not become fully operational until 1952. Competing fiercely with IBM, the company eventually built 46 of the earliest commercial computer systems.

1950s: Hardware and the first compiler

In the decades after World War II, computers grew rapidly in usefulness while decreasing in size and cost, spawning a huge and complex industry to create hardware and software for them. But in the 1950's, they were still huge and expensive and available only to a few people.

Assemblers

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

First compiler

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

1960s: Batch operating systems

The 1960s brought an era of commercial corporate use of computers for the first time. Groups of people used punched cards or paper tape to write programs, and then a computer operator would take the pile of stacked cards that build up and submit them as individual jobs to a large computer.

Batch operating systems

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Multics

Multicians.org has a huge amount of info waiting to be mined on Multics

1970s: Networks, better software, and smaller hardware

DARPA and the early networks

Some of the critical key concepts included the concepts of packet switching from Paul Baran and Leonard Kleinrock, and of interconnected heterogeneous networks, first called catenets, by Louis Pouzin and Vinton Cerf, with J.R. Licklider providing project direction and government support. These concepts led, by the 1980's, to the Internet.

Time-sharing operating systems: the birth of UNIX and C

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Mini-computers: hardware gets smaller

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Computer programming languages

During the 1970s, a variety of new programming languages emerged - the C programming language being the one which is still widely used in system programming. In addition, a variety of higher level languages emerged: Charles Moore's Forth, the logic programming language Prolog, and the beginning of the competitive process to develop Ada was started, which would later become the language used for military and aviation programming. The Lisp dialect, Scheme was also created, which would later become the language used in Structure and Interpretation of Computer Programs, the introductory programming course for MIT undergraduates.

Many programming languages also became more standardized, with the American National Standards Institute (ANSI) publishing specifications for COBOL, FORTRAN 77 and MUMPS.

1980s: Networks and personal computers, oh my!

Personal computers

The "big three" (Apple, Commodore and Tandy-Radio Shack) duke it out with the IBM compatible PC and Microsoft MS-DOS

The internet

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies. (Note: HTTP / the Web is specifically excluded from this decade, because the WWW came later)

DNS

Electronic mail

Newsgroups

TCP applications (FTP, gopher, telnet, WAIS, Archie / Veronica, etc.)

1990s: WWW, and several software revolutions

Dial-up internet access for the masses

Many small companies, and some large ones such as CompuServe, sold dial-up internet access accounts to the public. Suddenly anyone could get online and have email, although email systems were not always compatible with one another. Combined with the availability of home (personal) computers, internet usage grew astronomically.

The World Wide Web

In 1991, Tim Berners-Lee published the HTTP protocol and the HTML language, and in 1993 the Mosaic browser, the first to reach a wide audience, became available. These led to what is now called the World Wide Web (or just WWW).
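To show how simple the protocol is, here is a minimal sketch in Java that speaks HTTP over a raw socket (the host name is only an example; real applications would use an HTTP client library). The entire exchange is human-readable text: a request line and headers from the client, then a status line, headers, and the HTML document from the server.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.Socket;

    // Sends a minimal HTTP/1.0 GET request and prints the raw response.
    public class SimpleHttpGet {
        public static void main(String[] args) throws Exception {
            try (Socket socket = new Socket("example.com", 80)) {
                PrintWriter out = new PrintWriter(socket.getOutputStream());
                // An HTTP request is plain text: request line, headers, blank line.
                out.print("GET / HTTP/1.0\r\n");
                out.print("Host: example.com\r\n");
                out.print("\r\n");
                out.flush();
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(socket.getInputStream()));
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);  // status line, headers, then HTML body
                }
            }
        }
    }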

Object-oriented programming goes wild

Experimental programming languages based on object-oriented concepts had been developed decades earlier, but in the early 1990s, James Gosling and his team at Sun Microsystems began developing the Java platform, which soon took the programming world by storm. Java provided an alternative to C++ for many programmers, and it is now widely used on both the server and the client, as well as being taught as the first language in many computer science programs; it has become a mainstay of enterprise computing. Java, like many languages before and since, uses a virtual machine architecture: the programmer writes in the Java programming language, which compiles to bytecode, which in turn runs on a virtual machine that abstracts away the underlying platform. The idea is that this makes it simpler to write cross-platform applications, although in practice it is rarely as easy as the early Java hype made it sound. Java's popularity spread widely, with variants appearing on mobile phones (JavaME), on enterprise servers (JavaEE), and in web pages (the brief phenomenon of Java applets), to the point where some people began criticizing Java's dominance[19][20][21].
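A small illustration of that workflow: the source file below is compiled to bytecode a single time, and that identical bytecode runs on any platform with a JVM.

    // HelloPortable.java
    // Compile once:  javac HelloPortable.java   (produces HelloPortable.class, the bytecode)
    // Run anywhere:  java HelloPortable         (the local JVM executes the bytecode)
    public class HelloPortable {
        public static void main(String[] args) {
            // The bytecode is identical on every platform; only the JVM differs.
            System.out.println("Hello from " + System.getProperty("os.name"));
        }
    }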

A wide variety of new programming languages also appeared during the 1990s, often aimed at absolute beginners. Microsoft created Visual Basic, which worked both on its own and as part of the Office applications. 'Scripting' languages also became extremely popular: the Macintosh platform gained AppleScript, which has a "natural language" syntax designed for non-programmers (and which many programmers find cumbersome[22]). Python was a significant development, providing a simple scripting language that anyone could start using but which had advanced object-oriented features. Ruby became popular later, offering some of the same benefits.

Open-source software shakes things up

After a decade of failed attempts to make the popular Unix operating system run on low-cost Intel-based PC hardware, Linus Torvalds released the first version of the Linux kernel in 1991. Seeds planted in the 1980s by Richard Stallman's Free Software Foundation finally took root, and the open-source software movement took off, having a disruptive (and arguably positive) effect on the entire software economy.

XML, the universal translator

Another step on the road to operating-system (platform) independence, this text-based standard for self-describing data had a huge impact. As major programming languages such as Java and C# gained XML parsers, programs written in different languages could finally exchange structured information with one another, and much more.
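As a small illustration, here is a sketch in Java using the standard built-in DOM parser (the <order> document is invented for this example). Because the data is self-describing text, the same document could have been produced by a program in C#, Python, or any other language with an XML library.

    import java.io.ByteArrayInputStream;
    import java.nio.charset.StandardCharsets;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    // Parses a small XML document with Java's standard DOM parser.
    public class XmlExchangeDemo {
        public static void main(String[] args) throws Exception {
            // Self-describing data: any language with an XML parser can read this.
            String xml = "<order><item name=\"widget\" qty=\"3\"/>"
                       + "<item name=\"gadget\" qty=\"1\"/></order>";
            DocumentBuilder builder =
                    DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document doc = builder.parse(
                    new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            NodeList items = doc.getElementsByTagName("item");
            for (int i = 0; i < items.getLength(); i++) {
                Element item = (Element) items.item(i);
                System.out.println(item.getAttribute("name")
                        + " x " + item.getAttribute("qty"));
            }
        }
    }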

MP3: Honey, I shrunk the music

This lossy audio-compression format led to digital music players, file sharing, and much more. (need more here!)

After 2000

See also: Convergence of communications

Wireless networking

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Google, Maps, Mashups, Amazon, Yahoo

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

  • Mashup [r]: A web application or visualization created by combining data from multiple sources or applications. [e]

Global Positioning System (GPS)

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

  • Global Navigation Satellite System [r]: A system which allows small electronic devices to determine their location (Longitude, Latitude, and Altitude) as well as time with an accuracy of up to a few centimetres using time signals transmitted along a line of sight by radio from satellites. [e]
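In outline (a simplified sketch of the geometry, ignoring relativistic and atmospheric corrections): each satellite i broadcasts its position (x_i, y_i, z_i) and the time of transmission; from the signal's measured travel time, the receiver forms a pseudorange rho_i and solves

    \rho_i = \sqrt{(x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2} + c\,\Delta t, \qquad i = 1, \dots, 4

for its three position coordinates (x, y, z) and its own clock error Delta t, which is why signals from at least four satellites are needed for a full fix.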

Social networking

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies. MySpace, Facebook, LinkedIn. Also, Digg, Del.icio.us, StumbleUpon, etc.

Telephony on the internet (VOIP)

Virtualization: one computer, many operating systems

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Famous people in history of computing

For now, see this list of people who made conceptual breakthroughs in computer science.

Famous concepts in history of computing

For now, see this list of seminal concepts in computer science.

References

  1. The Abacus: A Brief History. Retrieved on 2007-04-24.
  2. Kaplan, Erez. 1996. The Controversial Replica of Leonardo da Vinci's Adding Machine. Retrieved on 2007-04-30.
  3. Abernethy, Ken and Allen, Tom. 2004. Early Calculating and Computing Machines: From the Abacus to Babbage. Furman University. Retrieved on 2007-04-30.
  4. A simplified example of the functionality of the Pascaline. La Machine de Pascal: la pascaline (French: Pascal's Machine: The Pascaline). Retrieved on 2007-05-04.
  5. Lemelson-MIT Program, Inventor of the Week Archive. MIT. (February 2003). Retrieved on 2007-05-14.
  6. Dunne, Paul E.. History of Computation. Retrieved on 2007-05-14.
  7. O'Connor, J. J. and Robertson, E. F. (July 1999). Hollerith Biography. School of Mathematics and Statistics University of St. Andrews, Scotland. Retrieved on 2007-05-14.
  8. "A Symbolic Analysis of Relay and Switching Circuits", MIT master's thesis published in T.A.I.E.E. Vol. 57 (1938), pp. 713-723. Transactions American Institute of Electrical Engineers (1938). Retrieved on 2007-05-12.
  9. "Claude Shannon" from Professor Ray C. Dougherty's course notes (V61.0003) Communication: Men, Minds, and Machines (Fall, 1996). Microsoft Corporation (1996). Retrieved on 2007-05-12.
  10. Colossus: The World’s First Electronic Computer. Pico Technology. Retrieved on 2007-04-24.
  11. The ENIAC Museum Online. University of Pennsylvania School of Engineering and Applied Science (SEAS). Retrieved on 2007-04-23.
  12. (German) Zuse, Konrad: Verfahren zur selbsttätigen Durchführung von Rechnungen mit Hilfe von Rechenmaschinen. Patentanmeldung Z 23 139 / GMD Nr. 005/021 / Jahr 1936. Konrad Zuse: Bibliography.. Retrieved on 2007-05-16.
  13. Zuse, Horst. The Life and Work of Konrad Zuse. Wimborne Publishing LTD and Maxfield & Montrose Interactive Inc. Retrieved on 2007-05-16.
  14. (1987) "Portraits in Silicon" by Robert Slater, ch. 5, p. 43. The MIT Press. 
  15. IBM Archives: IBM's ASCC (a.k.a. The Harvard Mark I). IBM. Retrieved on 2007-05-15.
  16. Lemelson-MIT Program, Inventor of the Week Archive. MIT. (October 2002). Retrieved on 2007-05-15.
  17. "The Eniac Museum Online", University of Pennsylvania School of Engineering Arts and Sciences. University of Pennsylvania. Retrieved on 2007-05-12.
  18. "First Draft of a Report on the EDVAC" (PDF format) by John von Neumann, Contract No.W-670-ORD-4926, between the United States Army Ordnance Department and the University of Pennsylvania. Moore School of Electrical Engineering, University of Pennsylvania, June 30, 1945. The report is also available in Stern, Nancy (1981). From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers. Digital Press. 
  19. Paul Graham, Java's Cover
  20. Paul Graham, Great Hackers
  21. Paul Graham, The Python Paradox
  22. Vincent Gable, (NS)AppleScript Sucks, IMLocation Development Blog