The Department of Energy is getting a 10-petaflop supercomputer to help scientists design efficient electric car batteries, understand climate change and unravel cosmic mysteries.
The IBM-built system, nicknamed "Mira," will be operational at Argonne National Laboratory next year. At 10 quadrillion calculations per second, it will be twice as fast as today's fastest supercomputer and 20 times faster than Argonne's current model. If every person in the United States performed one calculation every second, it would take almost a year for them to do as many calculations as Mira will do in one second, according to IBM.
This kind of computing power means Mira can solve problems that were previously too big for the most powerful current supercomputers. It would take Mira two minutes to solve a problem that takes current supercomputers two years, IDG News reports.
Thanks to improved chip designs and an energy-efficient water-cooling system, Mira will also be one of the most energy-efficient supercomputers in the world, IBM said. It runs on IBM's Blue Gene/Q platform and its impressive specs include more than 750,000 processors and 750 terabytes of memory.
The DOE selected 16 projects to start off with, including reducing energy inefficiency in transportation and developing advanced engine designs. The system will be able to model tropical storms, battery performance and the evolution of the universe, along with other complex simulations.
IBM said Mira is a stepping stone toward exascale computing, which beats petascale computers by a factor of 1,000. Exascale computers could solve questions that have remained beyond our reach, such as understanding regional climate change and designing safe nuclear reactors.
Meanwhile, IBM is building another 10-petaflop model called Blue Waters for the University of Illinois at Urbana-Champaign's National Center for Supercomputing Applications. And Lawrence Livermore National Laboratory is getting a 20-petaflop IBM model called Sequoia.
Mira will be operational in 2012 and scientists from industry, academia and government institutions will be able to use it.
Can I play World of Warcraft on it?
It play for u man
I predict that "Mira" will be unable to unlock the secrets of the perfect battery, or model tropical storms, or the evolution of the universe. Instead, Mira will offer to design for us an even more complex supercomputer to solve our questions, and after doing so Mira will go back to watching TV.
Will it answer any of these mysteries in the form of a question? Or beat any Russians in chess? Or become self-aware and decide battery performance is not on its to-do list?
The world's most advanced computer could solve in one second a mathematical problem that would take any human a year to figure out, and it's stuck with a dead-end job designing batteries?! I got just one thing to say to that..... Must construct additional Pylons b*tches!!
42. There you go, millions of dollars saved in trying to unravel those cosmic mysteries.
I know this statement probably isn't true, but the first thing that popped into my head was along the lines of:
"Using the biggest energy hog to save energy" *Big Grin, Two Thumbs Up*
Could you imagine building the thing? How do you keep track of 750,000 processors? How do you design something that uses 750,000 processors? Good lord. What if one goes bad? The crazy thing is each one has millions of transistors inside. Millions times 750,000 is insane to think about. Oh yeah... I guess it's pretty fast too.
I don't understand how these two statements can both be true.
"It would take Mira two minutes to solve a problem that takes current supercomputers two years,"
"At 10 quadrillion calculations per second, it will be twice as fast as today’s fastest supercomputer"
By definition the current fastest supercomputer is a "current supercomputer," and if you are only twice as fast as it, then what you do in 2 minutes it does in 4 minutes. How do you get to two years? Even if I'm not talking about the current fastest, do all the supercomputers drop off in power so fast that this machine is truly 525,600 times faster than them?
Two years in comparison to 2 minutes: 2 × 365 × 24 × 60 (two years in minutes) = 1,051,200, and to get it done in two minutes, 1,051,200 / 2 = 525,600.
The current fastest computer in the world is 2.5 Pflops; this is 10 Pflops, so it's 4 times faster.
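For anyone who wants to sanity-check the arithmetic in this thread, here's a minimal Python sketch reproducing both numbers (the 2.5 Pflops figure comes from the comment above, taken at face value):

```python
# Speedup implied by "two years on a current machine vs. two minutes on Mira".
minutes_in_two_years = 2 * 365 * 24 * 60   # 1,051,200 minutes
implied_speedup = minutes_in_two_years / 2
print(implied_speedup)                     # 525600.0

# Speedup implied by the raw flop ratings alone.
flops_speedup = 10 / 2.5                   # 10 Pflops vs. 2.5 Pflops
print(flops_speedup)                       # 4.0
```

The gap between 4x and 525,600x is exactly the commenter's point: the raw flop ratings alone don't explain the two-years-to-two-minutes claim.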
Being a computer, won't its predictions and analysis be just as flawed as the models given to it or am I just being closed-minded?
If it's going to make such incredible, far-reaching calculations, wouldn't we need a model of a system (say, the global climate and what we think we know about it) that's as close as possible to absolutely perfect?
It just seems like this is going to be a great way to be amazingly wrong BUT FASTER!
You would be correct once upon a time. It used to be that a computer was "only as smart as the program(mer) made it".
That's still true; they aren't sentient or anywhere close; they're really just very complex and well-integrated number crunchers. The most steroid-induced version of your home PC imaginable, and then some.
But programming has gotten more dynamic. Computers are no longer given models of things like climate to run; they are programmed to crunch massive amounts of data, run simulations, and, through the massive trial-and-error capabilities at their fingertips, develop their OWN model, and then test and refine it against actual data.
As evolution (as well as genetic algorithms) has shown, it is possible to substitute randomness and trial-and-error for intelligence and innovation, given enough time.
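To make the trial-and-error point concrete, here's a toy genetic algorithm in Python; the target bit string and the parameters are made up purely for illustration. It "solves" the problem with no insight at all, just mutation and selection:

```python
import random

# Toy genetic algorithm: evolve a bit string toward a fixed target purely by
# random mutation and selection -- no insight into the problem at all.
TARGET = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 1]

def fitness(candidate):
    # Count how many bits already match the target.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.1):
    # Flip each bit with a small probability.
    return [1 - b if random.random() < rate else b for b in candidate]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(50)]
for generation in range(1000):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        print(f"matched the target in generation {generation}")
        break
    # Keep the 10 fittest, refill the population with their mutated copies.
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(40)]
```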
A supercomputer can only go as fast as the program allows it to. A great example of this is a standard home computer vs. any supercomputer... if you don't make the necessary tweaks and optimizations to the system, my little i3 processor could do better than their 10,000 processors. Perhaps this supercomputer is built differently so it doesn't need as many "tweaks" to the software, so even though it's only 4x faster, it really could be 10x faster.
Actually the "tweaks" are that you pick the right problems. There are some problems that can only be solved by doing one thing at a time. Given that kind of problem, solving it the fastest means looking for the processor that can do those sequential steps the fastest.
But if you pick a problem that from the start is massively parallel, that is a whole other game. And that is the case with the problems these computers will take on. For instance, what if you have temperature, pressure, humidity, and similar data from 100 million sensors all over the world, each sampling once every 1/10 of a second? For each of these points on the earth you need to calculate some formula, and what's more, you want to include all the adjacent sensors in the formula. It is pretty easy to see that you can keep 750,000 processors quite busy, and a single processor wouldn't have a chance of even coming close to keeping up.
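Here's a minimal sketch of that kind of neighbor-aware, massively parallel computation; the grid size and the smoothing formula are made up, standing in for whatever the real physics would be:

```python
import random
from multiprocessing import Pool

# Hypothetical sensor grid: each cell holds one reading (say, temperature).
# Seeded so every worker process reconstructs the identical grid.
random.seed(0)
WIDTH, HEIGHT = 1000, 1000
grid = [[random.uniform(-40.0, 40.0) for _ in range(WIDTH)] for _ in range(HEIGHT)]

def update_row(y):
    # Each row can be computed independently of every other row -- that
    # independence is what lets 750,000 processors all stay busy at once.
    new_row = []
    for x in range(WIDTH):
        neighbors = [
            grid[ny][nx]
            for ny in (y - 1, y, y + 1)
            for nx in (x - 1, x, x + 1)
            if 0 <= ny < HEIGHT and 0 <= nx < WIDTH
        ]
        new_row.append(sum(neighbors) / len(neighbors))  # made-up formula
    return new_row

if __name__ == "__main__":
    with Pool() as pool:
        # One time step: the rows are farmed out across every available core.
        new_grid = pool.map(update_row, range(HEIGHT))
```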
And I think people are getting the wrong idea about what these computers will be doing. They aren't going to go off and "think" and come up with a great idea or model. They are going to process massive amounts of data (that has been gathered over long periods of time) and test many theories as fast as possible. In other words, they are going to "cheat" when it comes to making breakthroughs.
Example: which is smarter? A computer that can be taught the rules of chess and every game ever played, and is so fast that it runs every game ever played, plus 10 million more games every second, and then takes 2 minutes doing this before making the best move out of all those combinations? Or a human that can only think of a few dozen games, but has a strategy and can change that strategy as it needs to? Clearly the computer will win, but it didn't win because it is better at making judgments where data is lacking.
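That brute-force style of play looks roughly like this in miniature; here is a toy exhaustive search over a game of Nim rather than chess (my choice, just to keep the game tree small enough to search completely):

```python
from functools import lru_cache

# Exhaustive game-tree search over Nim: take 1-3 stones per turn, and
# whoever takes the last stone wins. The "computer" never strategizes;
# it simply tries every possible continuation.
@lru_cache(maxsize=None)
def can_win(stones):
    if stones == 0:
        return False  # No stones left: the player to move has already lost.
    # A position is winning if any move leaves the opponent in a losing one.
    return any(not can_win(stones - take) for take in (1, 2, 3) if take <= stones)

def best_move(stones):
    for take in (1, 2, 3):
        if take <= stones and not can_win(stones - take):
            return take
    return 1  # Every move loses; take the minimum and hope.

print(best_move(10))  # -> 2, leaving 8 stones (a losing position to inherit)
```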
And that is key in the world of intelligence. In this world we are seldom given all the time and data we would like. We can't wait until the year 2100, putting out sensors and gathering data, to decide if climate change is real or not. If you wait until then you won't need to predict anything; you will know, because it will have already happened. Computers are great at crunching numbers, but as it sits right now computers are not very good at making a leap in knowledge without the data.
So many components, it reminds me of the stories about ENIAC and UNIVAC blowing vacuum tubes. Lol, thank goodness for micro transistors.
Calculation speed alone does not determine a computer's overall speed. There are many other components in this machine that contribute to the computer's ability to outperform current supercomputers by a large margin. Memory size, memory type, memory speed, bus speed, hard drive size, hard drive speed, and the software that runs the computer are just a few examples.
The Technological singularity is happening.
Mankind is pregnant.
I wish they would start using supercomputers to help solve economic issues in this country, such as determining personal and corporate tax methods and studying how they affect economic growth. I would like to see spending and taxation considered more of a scientific, mathematical issue rather than the partisan political issue it is now.
Best computer yet is kinda hard to believe. When it hits the market I'll pick one up and review it on YouTube.
@tcolguin. Very good point.
All this hype and these 'wow' factor statements are all well and good for a 'sexy-tech' read, but the bottom line is: quantity of data is not the same as quality of output.
Recursive multiple parallel processing with crossover error detection and correction might work for 'scientific' solutions, but the question then arises: 'Will we be ready to deal with the answers?'
What we need is a system that can measure the impact of implementing such discoveries and inventions upon several interacting social dimensions (politics, resources, market forces, environment).
In a world fraught with nations on the brink of financial and military turmoil over dwindling resources and climate change, perhaps we should set aside 50% of the processing power to analyse scenarios for solutions that will not add to existing crises.
@eregorn8 (in addition to what tcolguin posted), that is why supercomputers are made from both CPUs and GPUs.
They each have their advantages and disadvantages.
Everyone here agrees that the 3.2 GHz Intel i7 holds the record for fastest CPU on the planet right now.
On certain tests and computing circumstances the GPU out of a two-year-old Nvidia GTX 280 was faster than the i7. In some cases 2.5 times faster!!!!!
No CPU on the planet can touch the parallel processing power of a good GPU (that is why they are used for games; you need parallel processing to do graphics), but if you just have a straightforward sequential mathematical equation a CPU will be faster.
I wonder what architecture they have to program for.
@eregorn8 I also think it's kinda funny that you bring up that they need good programming to use this beast. Not everyone is really aware of this, but SURPRISE: today's leading software developers are SHIT POOR at using the parallel processing powers of today's CPUs. Windows is a GREAT example. Sure, a few things here and there run a little faster, but swapping out your HD for an SSD will show FAR more improvement than going from a 4-core to a 6-core processor. Right now the only way these dual-, quad-, and 6-core CPUs are getting utilized is when you use more than one program at a time. VERY few programmers are utilizing parallel processing in today's multi-core processors for a single program (games do it, the PS3 does it, the Xbox 360 does); standard software makers... don't seem to care.
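For the curious, this is roughly what "actually using the cores" looks like for a single program; a Python sketch with a made-up CPU-bound workload, using the standard multiprocessing module:

```python
import time
from multiprocessing import Pool

def crunch(n):
    # Stand-in for one CPU-bound chunk of a single program's work.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [5_000_000] * 8

    start = time.perf_counter()
    serial = [crunch(n) for n in jobs]       # One core does everything.
    print(f"serial:   {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with Pool() as pool:
        parallel = pool.map(crunch, jobs)    # Chunks spread across cores.
    print(f"parallel: {time.perf_counter() - start:.2f}s")

    assert serial == parallel
```

On a multi-core machine the second timing should come in well under the first; the point is that the program has to be written this way on purpose, which most desktop software isn't.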
I pre-ordered 3 of these.