What It Is
GPU computing: Using a PC's graphics processor for general computing tasks, not just drawing images.
Why It's Radical
The graphics-processing unit (GPU) that normally handles only visual effects is taking over duties from the CPU, the computer's main chip. CPUs such as Intel's Core i7 max out at four computing cores. But graphics chips have dozens of cores that, though not as versatile, are ideal for parallel processing—breaking complex tasks into smaller chunks that the many cores work on simultaneously.
How You'll See It
Image- and video-editing applications are the first to take advantage of the GPU. The latest version of Adobe Photoshop, for example, can rotate and smoothly zoom in and out of photos, instead of moving in step-by-step increments. By using the graphics card, CyberLink's PowerDirector 7 video-editing app renders advanced effects in high-def video up to five times as quickly as the CPU alone.
New operating systems will open the GPU to even more programs. In Apple's upcoming OS X 10.6 Snow Leopard, programmers can use a language called OpenCL to write any application—from virus scanners to webcam software—so that it offloads work to the GPU. Microsoft will do the same for Windows by revising its DirectX programming interface, which is used mainly for game development. The new DirectX 11 will appear in the Windows 7 operating system and as an add-on to Windows Vista.
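In OpenCL, the GPU-side work is written as a small "kernel" function that each core runs on a different piece of the data. The fragment below is a minimal sketch of such a kernel (the kernel name `scale` is invented for the example, and the host code that compiles the kernel and copies data to the graphics card is omitted):

```c
/* OpenCL C device code: each work-item handles one array element.
 * The host program queues one work-item per element, and the GPU's
 * many cores run them in parallel. */
__kernel void scale(__global float *data, const float factor) {
    size_t i = get_global_id(0);  /* which element is mine? */
    data[i] *= factor;
}
```

Because the kernel is just a function over one element, the same source scales from a budget graphics chip to a high-end card without changes; the driver decides how to spread the work-items across cores.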
CPU-maker Intel joins the game late this year or early next with its first high-end video processor, code-named Larrabee. Based on the company's CPU designs, the chip will handle graphics as well as other number-crunching tasks.
Reader Comments
They've been utilizing the power of GPUs for quite a while now. A prime example is Folding@home, which harnesses a GPU's spare power while the PC handles most other tasks (though it's recommended you shut the client down before doing any 3D gaming!).
I don't know what took them so long to figure this out. The CPU and GPU need to be able to work together.
It would be beneficial to share memory in both directions too. Back in the day there was a utility called VGARAM that would convert unused VGA RAM into system memory.
I think all devices with onboard processors and memory should share unused resources in order to create "Total System Resource Sharing". You can use my acronym, TSRS, lol.