Thread ID: 28759 2002-12-30 10:13:00 Who invented computers...and why csinclair83 (200)
Post ID Timestamp Content User
109980 2002-12-30 10:13:00 Just been wondering..who invented this marvellous piece of equipment that sits in our lounges..at work..on our laps..and why?

mm light was invented so we can see..
phone..so we can communicate..
computer?? why...

and why is it called "computer" not something like i dunno..
csinclair83 (200)
109981 2002-12-30 10:23:00 The word "computer" stems from the verb "compute", which itself came from the French "computer" and the Latin "computare", originally formed from com- (together) and putare (to clear up, settle or reckon).

So "compute" means to estimate or determine by arithmetical or mathematical reckoning; to calculate, reckon, count. Adding the suffix "-er" to "compute" gives us the definitions we have for computer today.

Babbage first described a mechanical analytical "engine" in 1837.
Look it up on Google for a full explanation.
godfather (25)
109982 2002-12-30 10:57:00 There was a guy called Pythagoras who ran a secret society at Croton in Southern Italy about 500BC. According to at least one writer (Arthur Koestler in "The Sleepwalkers") he made the single greatest contribution to the intellectual development of humanity before or since - that is, his single-minded insistence that EVERYTHING can be described in terms of number. His idea led to digitisation and ultimately to your computer (also incidentally to the basis of most scientific experimentation).
Later, a number of other people invented/constructed calculating machines: Blaise Pascal made one about 1642, and Gottfried Wilhelm Leibniz (who also invented calculus) made another about 1672.
Charles Babbage, as noted, made a notable contribution, but the most significant development leading directly to the machine on your desk (or lap as the case may be) was Alan Turing's definition in the 1930s of the most general possible (universal) machine, and some suggestions and development of how this might be constructed and used. He used his ideas to crack the German Enigma codes during Hitler's war and started using them to try to get a realistic computer operational after the war, although he and his group were beaten to the actual construction by Eckert and Mauchly in the USA about 1946.
Since then, computers have developed and developed into their modern forms.
You can get more detail on a lot of this from the internet - Google searches etc.
rugila (214)
109983 2002-12-30 19:44:00 Good replies folks.. my 2 dollars' worth: even before Pythagoras the Chinese had the abacus, the funny wee counting frame. Digital? Well, you use your fingers [digits] to operate it.
rgds and happy ny Hugh
Hugh Jarse (1169)
109984 2002-12-30 20:23:00 Hugh Jarse. I wouldn't like you to miss the point, although of course it's totally your choice if you want to.
The concept of numbers and means of manipulating them were very well known before Pythagoras. His contribution was the insight that EVERYTHING can be described by or expressed in terms of number. It was already well known and widely used before his time that SOME things could be.
Yes, digitisation does come from your fingers and toes, but some early peoples used the spaces between their fingers as the basis for counting - leading to a system based on 4 or 8 for general numerical work rather than the 10 widely used these days. Had this approach come to predominate it might even have influenced the development of computing, since the 1's and 0's (i.e. on or off, there or not there) on which, as I understand it, all digital computers operate become fuzzier if you talk about a space (such as one between your fingers) that is either there or not there.
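(A minimal Python sketch, purely illustrative and not tied to any machine mentioned in this thread, of how arbitrary the base really is - the same number written in base 2 for on/off switches, base 4 or 8 for finger-space counting, and the familiar base 10:

    # Render a non-negative integer as a string of digits in a given base.
    def to_base(n, base):
        if n == 0:
            return "0"
        digits = []
        while n:
            n, r = divmod(n, base)          # peel off the lowest digit
            digits.append("0123456789"[r])  # bases up to 10 are enough here
        return "".join(reversed(digits))

    for base in (2, 4, 8, 10):
        print("42 in base", base, "is", to_base(42, base))
    # 42 in base 2 is 101010
    # 42 in base 4 is 222
    # 42 in base 8 is 52
    # 42 in base 10 is 42

Same quantity each time; only the notation changes.)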
rugila (214)
109985 2002-12-31 02:42:00 As for why? Lewis Richardson thought (about 1911) that it would be worthwhile to be able to predict the weather. He sketched a system using 64000 computers (people) in a "large" room. He even worked out over several months (on his own) a six-hour forecast for a small area in Germany for which there was a lot of data. The results were "grossly" out, and it has always been a problem (even now with supercomputers) that a forecast (even if it's correct) is no use if it comes a few weeks late. ;-) Have a look at Google, using "richardson weather forecasting", for some good articles.

The "first" electro-mechanical computer was probably Zuse's in 1938 . It didn't work very well, but he kept at it and produced machines commercially after WW2 . His machines were not used for war purposes . In 1961 the Zuse 23 had 256 words (40 bits long) of ferrite core memory and an 8k word magnetic drum . It could do 10000 additions/sec with a 150 kHz clock .

Various computers were designed in the US during WW2, but I don't think any of them was running before the end of the war.

The history of computers is distorted because the Bletchley Park work on codes was kept totally secret until 1974. The Colossus was a "real" computer, even though the amount of programmability was limited. Other machines there (like the "Bombes") were not computers. They were electro-mechanical pattern matchers.
Graham L (2)
109986 2002-12-31 03:01:00 To put a non-scientific opinion on the table: computers, or maybe just Windows, were invented so that Propecia shareholders could become filthy rich :D :p nz_liam (845)
109987 2002-12-31 03:16:00 Chris, You might like to look at The_History_of_Computing (www.thocp.net/) and also The Virtual Museum of Computing (http:) to get a good overview.

Cheers, Babe,
Babe Ruth (416)
109988 2003-01-20 09:03:00 An interesting point of history - just trying to remember off the top of my head..
Intel's first mass-production processor (the 4004) was designed for Busicom in Japan for some early calculators. Intel, being crafty, built a lot of extra functions into the processor beyond what the calculators' specs required, and I think Busicom only later got told of these extra functions, which sort of spoiled their next line of calculators - they'd already had the technology in front of them.

Another interesting fact is that AMD's 1982 80286-16/S CPU was produced under a second-source agreement with Intel.

The Abacus was indeed the first calculator and was considered the first 'adding machine'.

As for today's $ standards, IBM invented the PC/Computer & Microsoft invented Software. Oh, and Linus Torvalds invented Open Source :P
kiwistag (2875)