It’s a mystery why Gordon Moore’s “law,” which forecasts that processor
power will double every two years, still holds true a half century later.
It is now called Moore’s law, although Moore (who co-founded the chip maker Intel) doesn’t much like the name. “For the first 20 years I couldn’t utter the term Moore’s law. It was embarrassing,” the 86-year-old visionary said in an interview with New York Times columnist Thomas Friedman at the gala event, held at the Exploratorium science museum. “Finally, I got accustomed to it where now I could say it with a straight face.” He and Friedman chatted in front of a rapt audience, with Moore cracking jokes the whole time and doling out advice, such as: once you’ve made one successful prediction, avoid making another. In the background Intel’s latest gadgets whirred quietly: collision-avoidance drones, dancing spider robots, a braille printer—technologies all made possible by the advances in processing power that Moore’s law anticipated.
Of course, Moore’s law is not really a law like those describing gravity or the conservation of energy. It is a prediction that the number of transistors (a computer’s electrical switches used to represent 0s and 1s) that can fit on a silicon chip will double every two years as technology advances. That doubling yields incredibly fast growth in computing power without a concomitant rise in cost, and it has given us laptops and pocket-size gadgets with enormous processing ability at fairly low prices. Advances under Moore’s law have also enabled smartphone verbal search technologies such as Siri—it takes enormous computing power to capture spoken words, turn them into digital representations of sound and then interpret them to give a spoken answer in a matter of seconds.
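For readers who like to see the arithmetic, here is a minimal sketch of that doubling schedule in Python. The starting point of 2,300 transistors in 1971 (roughly what Intel’s first microprocessor held) is an illustrative assumption, not a figure from this article:

    # A strict two-year doubling, starting from an assumed baseline of
    # 2,300 transistors in 1971 (about what Intel's first chip held).
    def transistors(year, base_year=1971, base_count=2300):
        doublings = (year - base_year) // 2
        return base_count * 2 ** doublings

    for year in (1971, 1981, 1991, 2001, 2011):
        print(year, f"{transistors(year):,}")

Even from that modest baseline, a strict two-year doubling crosses the billion-transistor mark within about four decades, which is why the curve feels so dramatic.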
Another way to think about Moore’s law is to apply it to a car. Intel CEO Brian Krzanich explained that if a 1971 Volkswagen Beetle had advanced at the pace of Moore’s law over the past 34 years, today “you would be able to go with that car 300,000 miles per hour. You would get two million miles per gallon of gas, and all that for the mere cost of four cents.”
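The scale behind that analogy is easy to check. Assuming a strict doubling every two years over the 34-year span Krzanich cites, the overall improvement factor works out to about 131,000:

    # Back-of-the-envelope for Krzanich's analogy: 34 years of
    # doubling every two years.
    doublings = 34 // 2            # 17 doublings
    print(f"{2 ** doublings:,}x")  # 131,072x

A multiplier on that order is what turns ordinary 1971 specifications into numbers that sound like science fiction.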
Moore anticipated the two-year doubling trend based on what he had seen happen in the early years of computer-chip manufacture. In his 1965 paper he plotted the number of transistors that fit on a chip since 1959 and saw a pattern of yearly doubling that he then extrapolated for the next 10 years. (He later revised the trend to a doubling about every two years.) “Moore was just making an observation,” says Peter Denning, a computer scientist at the Naval Postgraduate School in California. “He was the head of research at Fairchild Semiconductor and wanted to look down the road at how much computing power they’d have in a decade. And in 1975 his prediction came pretty darn close.”
But Moore never thought his prediction would last 50 years. “The original prediction was to look at 10 years, which I thought was a stretch,” he told Friedman last week. “This was going from about 60 elements on an integrated circuit to 60,000—a 1,000-fold extrapolation over 10 years. I thought that was pretty wild. The fact that something similar is going on for 50 years is truly amazing.”
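Moore’s arithmetic is simple to reproduce: doubling once a year for 10 years multiplies the count by 2 to the 10th power, or 1,024, which is where the 1,000-fold figure comes from:

    # Yearly doubling over the 10 years Moore extrapolated in 1965.
    print(2 ** 10)       # 1,024 -- the "1,000-fold" factor
    print(60 * 2 ** 10)  # 61,440 -- on the order of the 60,000 elements he cites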
Just why Moore’s law has endured so long is hard to say. His doubling prediction turned into an industry objective for competing companies. “It might be a self-fulfilling law,” Denning explains. But it is not clear why it is a constant doubling every couple of years, as opposed to a different rate or fluctuating spikes in progress. “Science has mysteries, and in some ways this is one of those mysteries,” Denning adds. Certainly, if the rate could have gone faster, someone would have done it, notes computer scientist Calvin Lin of the University of Texas at Austin.
Many technologists have forecast the demise of Moore’s doubling over the years, and Moore himself says this exponential growth cannot last forever. Still, his law persists today, and as long as it does, the computational growth it describes will continue to change our world profoundly. As he put it: “We’ve just seen the beginning of what computers are going to do for us.”