Chips Can’t Get Much Smaller

About every two years, transistors shrink enough that twice as many can be placed on an integrated circuit as was possible two years earlier. This has held true since the mid-1960s, when the idea was first posited by Gordon E. Moore (today, it's called Moore's Law). If you were to plot the rate on a graph, you'd see an exponential curve: it starts slowly, then ramps up quickly, and in its idealized form it keeps climbing without bound. I say idealized because in the very practical real world, a limit is always reached due to physical constraints. In silicon-based computing (what we use today), that limit may be only four years away.
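As a rough illustration (not something from Moore's own paper), the doubling rule can be written as N(t) = N₀ · 2^(t/2), where t is years elapsed and N₀ is a starting transistor count. The short sketch below uses a purely hypothetical starting count just to show how quickly the numbers climb under that rule.

```python
# Rough sketch of Moore's Law: transistor count doubles every two years.
# The starting count and time span are hypothetical, chosen only to show
# the shape of the curve, not to match any particular chip.

initial_count = 2_300          # hypothetical starting transistor count
doubling_period_years = 2      # doubling interval per Moore's Law

for year in range(0, 21, 2):
    count = initial_count * 2 ** (year / doubling_period_years)
    print(f"Year {year:>2}: ~{count:,.0f} transistors")
```

After twenty years of doublings, the hypothetical count has grown by a factor of about a thousand, which is why the curve looks flat at first and nearly vertical later.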

Suman Datta, a researcher at Pennsylvania State University, has suggested we have only four more years of miniaturization left before we reach that practical limit for silicon chips. While companies like Intel have devised myriad techniques for circumventing these limits over the years, they will need to look elsewhere once transistors become so small that they leak more current than they can reliably switch.