About every two years, transistors shrink enough to fit twice as many on an integrated circuit as fit two years before. The trend has held since the mid-1960s, when Gordon E. Moore first posited it (today, it's called Moore's Law). If you were to plot transistor counts over time, you'd see them come out as an exponential curve: starting slowly, ramping up quickly, and in theory climbing forever. I say in theory because in the very practical real world, a limit will always be reached due to environmental feedback. In silicon-based computing (what we use today), that limit may be only four years away.
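
The doubling rule is simple compound growth: after t years, a chip that started with N0 transistors holds roughly N0 * 2^(t/2). Here's a minimal Python sketch that makes the shape of that curve concrete; the 1,000-transistor baseline is an arbitrary illustration, not a historical figure:

```python
def transistor_count(n0: float, years: float) -> float:
    """Project a transistor count forward in time, assuming a clean
    doubling every two years (the idealized Moore's Law curve)."""
    return n0 * 2 ** (years / 2)

# Illustrative only: start from an arbitrary baseline of 1,000
# transistors and watch the count double every two years for two decades.
for t in range(0, 21, 2):
    print(f"year {t:2d}: ~{transistor_count(1_000, t):,.0f} transistors")
```

After twenty years of doubling, the count has grown a thousandfold, which is why the curve looks nearly flat at first and then vertical.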