In the past sixty years or so, computers have migrated from room-size megaboxes to desktops to laptops to our pockets. But the real history of machine-assisted human computation (“computer” originally referred to the person, not the machine) goes back even further.
First in the historical record was the abacus, helping the ancient technorati gain an edge over trading partners still counting cows and amphorae by hand. The oldest known complex computing device, the Antikythera mechanism, dates back to around 87 B.C.; it's surmised the Greeks used this gear-operated contraption (recovered from a shipwreck in the Aegean Sea early in the 20th century, though its workings weren't fully decoded until 2006) to calculate astronomical positions and help them navigate the seas. Computing took another leap in 1843, when the English mathematician Ada Lovelace wrote the first computer algorithm, in collaboration with Charles Babbage, who had devised the theory of the first programmable computer.

But the modern computing-machine era began with Alan Turing's conception of the Turing machine and with three Bell Labs scientists' invention of the transistor, which made modern-style computing possible and landed them the 1956 Nobel Prize in Physics. For decades, computing technology was exclusive to the government and the military; later, academic institutions came online, and Steve Wozniak built the circuit board for the Apple-1, making home computing practicable. On the connectivity side, Tim Berners-Lee created the World Wide Web, Marc Andreessen built a browser, and that's how we came to live in a world where our glasses can tell us what we're looking at. With wearable computers, embeddable chips, smart appliances, and other advances in progress and on the horizon, the journey toward building smarter, faster, and more capable computers is clearly just beginning.
Infographic by Julie Rossman.
Marlon R. Narag says
True — in our desktop computers, portable gadgets, appliances, and other electronics (laptops included), it's the story of how vacuum tubes gave way to transistors, then to ICs, microchips, and microprocessors, all connected as multiples of units: deka, hecto, kilo, mega, giga, tera on the positive-exponent side, and deci, centi, milli, micro, nano, pico on the negative side. MARLON R. NARAG
Mehul Michael Jay Desai says
Alan Turing… committed suicide at 41 in 1954. He might've lived twice that many years and contributed so much more.
Ralph Seguin says
Pffft. These computer things are just a fad! They'll never catch on. Next you'll be saying we'll be able to connect them and they'll be able to communicate.
Otis Sigers says
The Commodore VIC-20 and Commodore 64 are where I began. Played some with Fortran and BASIC, went from there.
Where is George Boole, around the 1840s? The father of Boolean algebra?