There have been a few articles surrounding IBM’s announcement that the perennial Moore’s Law is dead, or at least soon will be. It’s a controversial statement, but I’d say it’s one that most people trained as computer scientists (the old-school kind who spent more time working at the interfaces between hardware and software in the early days of computing, versus those who spend most of their time inside environments and frameworks in the recent era) have to recognize as accurate.
Moore’s Law is often misinterpreted as “computing power doubles every 2 to 3 years”. Gordon Moore, the co-founder of Intel and the law’s namesake, merely observed that the transistor density of silicon chips doubled about every 2 years. While this often does translate into computing performance, it has implications beyond performance alone.
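To make the observation concrete, it can be written as a simple exponential. This is my own illustrative sketch, not anything from IBM or Moore; it assumes a clean 2-year doubling period, whereas the real cadence has drifted over the decades.

```python
def transistor_density(initial_density: float, years: float,
                       doubling_period: float = 2.0) -> float:
    """Projected transistor density after `years`, assuming it doubles
    every `doubling_period` years (the classic Moore's Law cadence)."""
    return initial_density * 2 ** (years / doubling_period)

# Starting from a normalized density of 1.0, ten years of doubling
# every 2 years gives 2**5 = 32x the original density:
print(transistor_density(1.0, 10))  # 32.0
```

The point of writing it out is how unforgiving the exponent is: a decade of “business as usual” means a 32x density improvement, which is exactly the kind of expectation the industry can no longer meet.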
Despite what appears to be the continued frenzied pace of development in all things IT-related, people tend to forget the underlying foundations of computer systems. So at first glance this claim that computer advances are slowing seems ridiculous, but it refers to only one kind of advance… one that tends to have far-reaching effects throughout IT.
I agree with IBM’s statement. The pace of fundamental advances in “on-die” electronics (i.e. processors/CPUs) has slowed to a crawl. Some may argue that isn’t true, but only because they are looking at the macro-advances: performance tricks, variations on a theme, or newly discovered phenomena that can be exploited for specific use cases. In the end, none of these are fundamental shifts; they are iterative advances. And I am not knocking that… there is some pretty smart stuff in all of it.
But let me illustrate both concepts.
About a decade or more ago, the same alarm was raised, and nothing ever came of it.
Let’s start with the problem(s). There are two big physical limitations on processors.
- Electron Flow Through Wires
When electricity runs down a wire, it generates a small electric field. As you place wires closer and closer together, eventually the field around one wire starts to interfere with the signal in its neighbor (and vice versa). So there is a physical limit on how close you can place wires before the interference becomes problematic. This is often referred to as Cross Talk.
- Controlling Electron Flow
Signals in digital circuits are controlled by “logic gates”. These gates are basically switches that turn electricity on or off. The physics of a logic gate goes something like this: I have a conductor (i.e. a wire or something similar) that, in one state, allows electricity to flow through it and, in another state, prevents electricity from flowing through it. This seems simple enough, but the gotcha is that, depending on how the gate functions, it takes more power to either allow or stop the signal… and the process almost always generates heat. This is because “resistance” (both a physics phenomenon and an actual electronic component) trades off the flow of power for heat: to restrict the flow of power, a resistor converts some of it to heat and expels it as a by-product. This is why your home computer generates so much heat and why CPUs have to be cooled by fans and heat sinks. The circuits make other trade-offs too (creating electric fields to store power, etc.), but heat generation is the one we can easily detect with our senses, and it has a huge impact on computer performance and power consumption.
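The power-versus-heat trade-off above can be sketched with a common first-order model for CMOS switching power, P ≈ C × V² × f (switched capacitance times voltage squared times clock frequency). The model and all the numbers here are my own assumption for illustration, not something from the article:

```python
def dynamic_power(capacitance_f: float, voltage_v: float,
                  frequency_hz: float) -> float:
    """First-order CMOS dynamic (switching) power in watts:
    P ~ C * V^2 * f. Ignores leakage and short-circuit current."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

# Doubling the clock at the same voltage doubles the power drawn,
# and essentially all of that extra power leaves the chip as heat:
p_slow = dynamic_power(1e-9, 1.2, 2.0e9)  # hypothetical 2 GHz part
p_fast = dynamic_power(1e-9, 1.2, 4.0e9)  # same die at 4 GHz
print(p_fast / p_slow)  # 2.0
```

The quadratic voltage term is why the industry spent years lowering supply voltages: it buys back far more heat than frequency reductions do.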
So, probably somewhere around 1996 or so, processor engineers and scientists were faced with these two problems: they couldn’t make chips smaller (because of Cross Talk), and gate technology stalled because faster circuits needed more power, which in turn required the gates to pass more power, which in turn generated much more heat.
Silicon chip technology looked dead, and the only alternative at the time, Gallium Arsenide, was both much more expensive and a potential environmental problem if it became a mass-market technology. (Think landfills full of the heavy metal gallium and arsenic [a poison]. Lovely, right?)
So how was this death knell for Silicon avoided?
A smart physicist figured out how to fix the logic gate problem. In production, circuits are tested for speed. If they are too fast, the circuits are “doped” with electrons to slow them down. This way, if you buy a 3.0 GHz processor, you get a 3.0 GHz processor, not a 3.052345 GHz processor.
Well, this physicist wondered: what if we “dope” the gates with something other than electrons… what about ions of other elements? He guessed right. After testing a few elements (if I recall correctly, they settled on Xenon ions), the doping in effect placed a “rough road” on the gate that prevented electrons from leaking across it; when a small charge of electricity is applied, it neutralizes the ions’ effect and allows power to flow across the gate.
Along with other advances, like dealing more effectively with cross talk, this bought silicon technology another decade.
Unless someone figures something else like this out, eventually silicon will need to be replaced.
Whether someone does is the big question…
If not… there are two technologies on the horizon, and thankfully, neither is Gallium Arsenide:
Nano-photonics and Quantum Computing.
My opinion is, nano-photonics will dominate computers eventually.
As a trained computer scientist, I love the idea of Quantum Computing, and these kinds of computers will eventually be built, but there is a big difference between the applications of photonics and those of quantum computing.
Quantum computing is more or less aimed at enabling us humans to deal effectively with math problems that are, by current standards anyway, intractable. This is important to many sciences, particularly engineering and physics. But it does not lend itself to more mundane things like database applications, email, tweeting, virtualization, web servers, or other run-of-the-mill IT work.
Although I have no doubt there will be some melding of the two: many big IT companies already run compute engines to analyze, categorize, and make predictions about their users, and from that perspective they may have a use for Quantum Computing.
Certainly the crackers… the bad guys… will *love* quantum computers. Technically, cracking passwords *is* an intractable math problem, and a quantum computer could theoretically make passwords, and most kinds of encryption, well… no longer intractable… possibly even useless.
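To put a rough number on that worry: the article doesn’t name an algorithm, but the standard example is Grover’s search, which lets a quantum computer scan an unstructured space of N keys in roughly √N steps instead of N. The sketch below is illustrative arithmetic, not a security analysis:

```python
import math

def classical_steps(key_bits: int) -> float:
    """Brute force tries every one of the 2**key_bits keys."""
    return 2.0 ** key_bits

def grover_steps(key_bits: int) -> float:
    """Grover's search needs only about sqrt(2**key_bits) quantum
    queries -- i.e. the key 'feels' half as long."""
    return math.sqrt(2.0 ** key_bits)

print(classical_steps(128))  # ~3.4e38 steps
print(grover_steps(128))     # ~1.8e19 -- a 128-bit key behaves like 64 bits
```

Grover’s quadratic speedup halves the effective key length rather than erasing it outright; the truly devastating case is Shor’s algorithm against public-key schemes like RSA, which is where the “possibly even useless” scenario lives.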
Isn’t that a cheery thought?