Anti-Buzz: Moore’s Law

by Andrew Emmott on August 30, 2014

in Anti-Buzz, Future Tech, General, Hardware

Andrew has been writing Anti-Buzz for four years, resulting in almost 200 articles. For the next several weeks we will revisit some of these in case you missed them.


The Buzz Word: Moore’s Law. Sometimes I feel the anti-buzz tag line invites a rude attitude and I want to skip it. This is one of those times. Moore’s Law is a very popularly understood concept, and there is nothing wrong with this popularity. Discussing Moore’s Law in the context of buzzwords is merely meant to be fun and informative, not a lesson in how “wrong” the average person is. That said, Moore’s Law is a buzzword; remember that the defining characteristic of a buzzword isn’t strictly accuracy, but application broader than originally intended.

The law takes many forms, but a minimalist, popular definition could be “Every 18 months, computing power doubles.” Moore himself was, strictly speaking, talking about the number of transistors that could be fit into the same amount of space. Despite a strong connection between the number of transistors and the performance of digital electronics, Moore was not directly addressing the falling price of hard drives or memory, or the mega-pixels in your camera, or the number of users on Facebook; yet every time something doubles, somebody wants to cite Moore’s Law.

Moore himself has joked that by the sound of it, you’d think he invented the exponential. This is an understandable reaction by the public. Transistors-per-square-inch is an abstruse thing to appreciate, and even when you draw the connection between that and processing power, people want to generalize this to a discussion about every statistic that could ever be used to describe their computer: RAM, screen resolution, storage space, battery life, you name it. Moore’s Law also puts a name to the problem the public has had with computing for the past two decades: it changes too fast. Moore’s Law gives you someone to blame. These popular generalizations are fine and not even that inaccurate, but what’s the real story behind Moore’s Law?

The True Moore’s Law: Gordon Moore, co-founder of Intel, was one of several people who, in 1965, observed a trend in new integrated circuit technology. He stuck his neck out and famously predicted that the number of transistors on a single microchip would continue to double every year for at least the next ten years. In 1975, he reviewed the situation and revised his statement, claiming that the count would continue to double every two years for the foreseeable future. Speaking strictly from the transistors-per-chip point of view, Moore’s two-year law still holds today. Interestingly, Moore never once claimed a turnover of 18 months, and never argued for anything other than the prowess of manufacturing technologies and how many transistors they could cram into one place. One notable thing to point out: Moore’s Law hasn’t held true for all these decades because of some crazy computer magic, but because the integrated circuit industry, especially Intel, has used the law to guide its long-term strategy and has seen it as something of a mandate for how fast and hard it needs to improve its technology.

Power Doubling Every 18 Months: So where does the turnover of 18 months come from? This is the popularly understood figure. An Intel colleague of Moore’s, David House, sought to give a more applied definition of Moore’s Law. He stated that the practical implication of Moore’s Law was that CPUs would double in speed every 18 months. This claim is much easier to digest if you are an outsider, but it is also narrower; quite a bit more than processor speed matters to your machine’s performance. I think we all bandy about “Moore’s Law” and not “House’s Law” because Moore’s Law impacts all facets of computing technology, which is what we need if we’re bringing mega-pixels and video RAM into the conversation. House was speaking strictly about processor speed, and his law has become increasingly hard to judge in the past decade, with multi-core CPUs really muddying our ability to account for processing speed.
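To make the arithmetic concrete, here is a minimal sketch in Python (my own illustration, not anything from Moore or House; the function name growth and the ten-year window are arbitrary choices) that compounds each doubling period into a total improvement factor:

    def growth(years, doubling_period):
        """Multiplicative improvement after `years`, doubling every `doubling_period` years."""
        return 2 ** (years / doubling_period)

    # Moore's original 1965 observation: doubling every year.
    print(f"1-year doubling over a decade:   {growth(10, 1.0):,.0f}x")  # ~1,024x
    # Moore's 1975 revision: doubling every two years.
    print(f"2-year doubling over a decade:   {growth(10, 2.0):,.0f}x")  # ~32x
    # House's applied figure: doubling every 18 months.
    print(f"18-month doubling over a decade: {growth(10, 1.5):,.0f}x")  # ~102x

The gap between 32x and 102x over the same decade is why the distinction between Moore’s two-year figure and House’s 18-month figure actually matters.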

Beyond the CPU: So, Moore’s Law does impact the number of mega-pixels in your camera and the cost of your hard drive, but it is not a direct relationship. Not everything benefits from more transistors the same way a processor does. We vaguely wrap “number of transistors” up in the term “computing power”, use House’s number because it’s a bit sexier and, honestly, it’s close enough to correct that pedants like me shouldn’t really care. The important thing to understand is that most aspects of computing technology improve exponentially fast, which I suppose is yet another abstruse formulation. The other key concept is that processing power isn’t the only thing that matters (the marketing campaigns of processor manufacturers have spent the last two decades convincing you otherwise), and “computing power” is a little more amorphous than just how many hertz your computer can flop over.

How much longer will Moore’s Law hold out? Nobody quite agrees, but there is good reason to suspect that it won’t last much longer. Even with that knowledge, don’t get sucked into popular doomsday notions of what that means. When Moore’s Law finally fails, it doesn’t mean that computers will stop getting better. In fact, it doesn’t even mean they will stop getting exponentially better; a common estimate is that improvements will only slow such that the number of transistors doubles every three years instead. That’s still pretty fast, and who knows how many decades that will last.
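For a sense of what that slowdown would mean, a quick back-of-the-envelope check (the same hypothetical arithmetic as the sketch above, not a published projection): doubling every three years still compounds to roughly a tenfold improvement per decade.

    # Even the "slowed" three-year doubling still compounds to about 10x per decade.
    print(f"3-year doubling over a decade: {2 ** (10 / 3):.1f}x")  # ~10.1x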
