Thread 41208 | Press F1 | Mainframes and Supercomputers | started 2004-01-03 05:58:00 by toxicbass (4045)
Post 205013 | 2004-01-03 05:58:00 | toxicbass (4045)

Hi, I was watching BBC the other night at perhaps 3am. They had a computer show featuring the inventor of MP3, the 'inventor' of the internet, and a DIY supercomputer in the USA: 1100 G5 Macs, 2GB RAM each, dual 400MHz CPUs. "1.7 trillion(?) floating point operations/sec". It was late, I may have misheard that; perhaps it was only billion.

Okay, okay, I don't like Macs; I know they are certainly effective at some things. But how do PCs compare? Can you please give me some links? Say some Intel dual 2GHz boxes with the same RAM each, a non-cheapie mobo, and abnormally fast networking cable.

I sit here at tech, in a room of 40 PCs, with just one being used. I want to make use of them all! Solve some mysteries! Do some good! My mate's intention was to run that 'SETI' screensaver on all the PCs in the room. To me it's a load of crap, but it's the thought that counts. lol :-)

(link) www.idg.net.nz

Cheers!
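A back-of-envelope check on that "1.7 trillion" figure: a cluster's theoretical peak is roughly nodes × CPUs per node × clock rate × floating-point ops per cycle. The clock rate and ops-per-cycle below are illustrative assumptions (the 400MHz mentioned above may have been the bus speed rather than the CPU clock), not the real cluster's measured specs, and sustained performance is always well below this peak:

```python
# Back-of-envelope peak FLOPS estimate for a DIY cluster.
# All figures are illustrative assumptions, not measured specs.
nodes = 1100            # machines in the cluster
cpus_per_node = 2       # dual-CPU G5s
clock_hz = 2.0e9        # assuming 2 GHz CPUs (not the 400 MHz bus)
flops_per_cycle = 4     # plausible for a CPU with SIMD multiply-add units

peak_flops = nodes * cpus_per_node * clock_hz * flops_per_cycle
print(f"Theoretical peak: {peak_flops / 1e12:.1f} teraflops")
# -> Theoretical peak: 17.6 teraflops
```

Under these assumed numbers the peak lands in the tens of teraflops, so "trillion" (tera) is at least the right order of magnitude for the TV figure, where "billion" (giga) would not be.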
Post 205014 | 2004-01-03 08:55:00 | DangerousDave (697)

Supercomputing is a pretty advanced topic (see: 3rd-year computer systems engineering at Auckland University). But basically, all the computers are networked and the work is divided between the CPUs. Think threads in Windows: each program usually makes up one thread, and that thread is loaded onto one CPU and processed there. The program divides its work between CPUs to lighten the load, hence finishing faster.

I believe Massey University in Albany has a supercomputer consisting of 200 Athlon XP 2000MHz processors. There shouldn't be much difference between PCs and Macs (ceteris paribus), as they all need a special operating system and software to run on them, rather than plain Windows, Linux, or OS X. The software needs to be optimised for 200 CPUs; Battlefield 1942 isn't going to own on 200 CPUs unless it's been optimised to do so.

I've seen some code for programming distributed computers and WOW, you need an understanding of networking protocols and threads and everything. It's crazy (gotta love ASM/C++).

SETI is apparently one of the world's biggest distributed computing projects. It's a great little program to have going when you're away from your computer, because obviously everyone wants to find aliens... especially people with computers.

- David
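The split-the-work idea David describes can be sketched in a few lines: chop the job into chunks, hand each chunk to a worker process (standing in for a node's CPU), then combine the partial results. This is a minimal sketch of the general pattern, not the code any of these clusters actually run:

```python
# Minimal sketch of dividing one job across several CPUs:
# split the work into chunks, process each chunk in a worker,
# then combine the partial results.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """Sum of squares over the half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    step = n // workers
    # Last chunk absorbs the remainder so every i < n is covered once.
    chunks = [(w * step, n if w == workers - 1 else (w + 1) * step)
              for w in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Same answer as the serial loop, but the chunks run concurrently.
    print(parallel_sum_of_squares(1000))
```

The hard parts David mentions (networking protocols, synchronisation) show up when the workers are separate machines rather than processes on one box, but the divide/compute/combine shape is the same.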
Post 205015 | 2004-01-03 09:03:00 | Elephant (599)

I'm fairly sure cancer research has lately been using a distributed network as well.
Post 205016 | 2004-01-03 09:20:00 | Gorela (901)

Hi toxicbass. If you are interested in clustering there are numerous sites. The main one has to be Beowulf (www.beowulf.org). :) There is also OpenMosix (http:), which is an interesting add-on for a type of clustering solution. HTH
Post 205017 | 2004-01-04 01:17:00 | Graham L (2)

That G5 cluster is the 3rd fastest "computer" in the world. (There's a list of the fastest 500.)

I can remember when one of the Crays had a cycle time of 80 ns. At that time, 80 ns was the amount of time added to the cycle time of a PDP-11 if you added the memory parity test option :D A Burroughs 5700/6700 series mainframe had about a 100 ns cycle for the FPU, and 200 ns for the CPU.

Unfortunately, it's easier to build a cluster supercomputer than to use it. The problems have to be suited to the architecture, or you can finish up with one of the processors working hard on a bit of the problem while all the others idle, waiting for it. :-( Special compilers are needed too.

A Google search for "stone soup" will find the origins of the Beowulf (Linux) systems.
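Graham's point about idle processors can be put in numbers: a parallel job finishes only when its slowest chunk finishes, so elapsed time is the *maximum* chunk, not the average. A toy model with made-up work units:

```python
# Toy model of cluster makespan: the job ends when the busiest CPU ends,
# so a bad partition leaves the other CPUs idle. Work units are made up.
def makespan(chunks):
    return max(chunks)  # elapsed time = largest chunk of work

even   = [25, 25, 25, 25]  # 100 units split evenly over 4 CPUs
uneven = [70, 10, 10, 10]  # same 100 units, badly partitioned

print(makespan(even))    # finishes in 25 time units
print(makespan(uneven))  # 70 time units: three CPUs idle most of the run
```

Same total work in both cases, nearly triple the wall-clock time when the split is bad; this is the balancing problem those special compilers and schedulers try to solve.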