The New York Times reports today that Google and IBM are sinking $30 million into a two-year project to build remote data centers for sophisticated computing research. No World of Warcraft player will be safe again now that students can crunch probabilities with the 1,600-plus processors Google is installing in an undisclosed location.
But seriously: the two companies—along with six universities (Carnegie Mellon, the Massachusetts Institute of Technology, Stanford University, the University of California, Berkeley, the University of Maryland, and the University of Washington)—are cooperating to get an inadequately funded area of research off the ground. The Times succinctly defines "cloud computing" as a "new kind of data-intensive supercomputing" that "often involves scouring the
Web and other data sources in seconds or minutes for patterns and
insights." It's typically used by major corporations to analyze Web traffic and refine large systems, but now any university kid with a password will be able to write software that takes advantage of the horsepower Google and IBM are providing. —Jacob Ward