What It Is
GPU computing: Using a PC’s graphics processor to perform general computing tasks beyond drawing graphics.
Why It’s Radical
The graphics-processing unit (GPU) that normally handles only visual effects is taking over duties from the CPU, the computer’s main chip. CPUs such as Intel’s Core i7 max out at four computing cores. But graphics chips have dozens of cores that, though not as versatile, are ideal for parallel processing—breaking complex tasks into smaller chunks that the many cores work on simultaneously.
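The chunking idea above can be sketched in a few lines. This is a minimal, illustrative Python example (all function and variable names are invented for illustration): it splits an image-brightening job into chunks and hands them to a pool of workers running at once, the same divide-and-conquer pattern a GPU applies across its dozens of cores.

```python
# Conceptual sketch of parallel processing: one big task is broken into
# chunks that multiple workers handle simultaneously. A GPU fans such
# chunks out across dozens of cores; here CPU threads stand in for them.
from concurrent.futures import ThreadPoolExecutor

def brighten_chunk(chunk):
    """Do the same simple work on one slice of the data."""
    return [min(pixel + 40, 255) for pixel in chunk]

def brighten_parallel(pixels, workers=4):
    # Break the pixel data into one chunk per worker...
    size = max(1, len(pixels) // workers)
    chunks = [pixels[i:i + size] for i in range(0, len(pixels), size)]
    # ...then process all chunks at the same time.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(brighten_chunk, chunks)
    # Stitch the finished chunks back together.
    return [p for chunk in results for p in chunk]

if __name__ == "__main__":
    image = list(range(0, 256, 2))  # stand-in for pixel brightness values
    print(brighten_parallel(image)[:4])  # → [40, 42, 44, 46]
```

The payoff comes when the work per chunk is heavy and the cores are many, which is exactly the situation a graphics chip is built for.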
How You’ll See It
Image- and video-editing applications are the first to take advantage of the GPU. The latest version of Adobe Photoshop, for example, can rotate and smoothly zoom in and out of photos, instead of moving in step-by-step increments. By using the graphics card, CyberLink’s PowerDirector 7 video-editing app renders advanced effects in high-def video up to five times as quickly.
New operating systems will open the GPU to even more programs. In Apple’s upcoming OS X 10.6 Snow Leopard, programmers can use a language called OpenCL to write any application—from virus scanners to webcam software—so that it sends data to the GPU. Microsoft will do the same for Windows by revising its DirectX programming interface, which is used mainly for game development. The new DirectX 11 will appear in the Windows 7 operating system and as an add-on to Windows Vista.
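OpenCL’s central idea is the "kernel": a short function, written once, that the runtime launches once per data element (a "work-item") across the GPU’s cores. As a rough pure-Python sketch of that model only—real OpenCL kernels are written in a C-like language and dispatched by the driver; every name below is invented for illustration:

```python
# Rough sketch of OpenCL's data-parallel kernel model in plain Python.
# In real OpenCL the runtime launches one work-item per element across
# the GPU's cores; here a plain loop stands in for that hardware fan-out.

def scale_kernel(global_id, src, dst, factor):
    """Kernel body: each work-item handles exactly one element."""
    dst[global_id] = src[global_id] * factor

def enqueue_kernel(kernel, global_size, *args):
    # A GPU would run these work-items in parallel; this sequential
    # loop only models the one-invocation-per-index execution pattern.
    for gid in range(global_size):
        kernel(gid, *args)

src = [1.0, 2.0, 3.0, 4.0]
dst = [0.0] * len(src)
enqueue_kernel(scale_kernel, len(src), src, dst, 2.0)
print(dst)  # → [2.0, 4.0, 6.0, 8.0]
```

Because each work-item touches only its own index, the runtime is free to run thousands of them at once—which is why ordinary applications, not just games, can hand work to the graphics chip.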
CPU-maker Intel joins the game late this year or early next with its first high-end video processor, nicknamed Larrabee. Based on the company’s CPU designs, the chip will handle graphics as well as other number-crunching tasks.