Is Moore's Law a Dead Letter Yet?
Moore's Law, in its popular form, says that computing power doubles every two years. For a long time it's been true, but exponential growth can't continue forever. The trend is slowing down.
This doesn't mean that the growth in computing power is over. It will keep increasing at an impressive rate. But doubling every two years means increasing by a factor of a thousand every twenty years, and the jump from terabytes to petabytes might take a little longer than that.
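The thousandfold figure is just compound doubling: ten doublings in twenty years gives 2^10 = 1024. A quick sketch (the function name is ours, purely for illustration):

```python
# Compound growth under Moore's Law: capacity doubles every `doubling_period` years.
def growth_factor(years, doubling_period=2):
    """Factor by which capacity multiplies over the given span of years."""
    return 2 ** (years / doubling_period)

print(growth_factor(20))  # 2**10 = 1024.0, roughly a thousandfold
print(growth_factor(10))  # 2**5  = 32.0
```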
Gordon Moore observed, back in 1965, that the number of transistors that would fit in a given circuit space was doubling every year; in 1975 he revised the estimate to every two years. Electronic components can only shrink so much before they run into the limits of physics. The smallest transistors today are only about a hundred atoms across.
Alternatives to silicon
Silicon transistors are down to 10 nanometers, and 7-nanometer chips are on their way. There's talk of 5-nanometer electronics, but that may be the limit. More speed will have to come from a different material or a different basic technology. Engineers are looking at a lot of candidates.
Carbon shows up in several forms. Carbon nanotubes are smaller than 10 nanometers and offer faster switching speeds than silicon. They dissipate heat better, too. Producing them is the hard part: existing technology can't yet turn out nanotubes of consistent quality at scale. An engineering breakthrough will be needed before manufacturers can ship nanotube chips.
Others are looking at graphene, a hexagonal lattice of carbon atoms just one atom thick. Scientists have developed a graphene transistor that draws less power than its silicon counterpart and can therefore switch faster without overheating. Some people are even looking at diamond-based processors.
One of the strangest ideas is going back to vacuum tubes, but on a microscopic scale. Everything old becomes new again.
Three-dimensional construction of processors might get more speed out of components without reducing their size, since it can bring them closer together. The big problem there will be heat dissipation.
Can multi-core computers keep the pace going? That's not clear. Multiple cores deliver their full potential only when the workload can keep all of them busy in parallel. Some kinds of tasks, like large-scale matrix math, parallelize well, but many everyday operations are inherently serial.
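The article doesn't name it, but the standard way to quantify this limit is Amdahl's law: if a fraction of the work must run serially, that fraction caps the speedup no matter how many cores you add. A small sketch, with illustrative numbers of our choosing:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Maximum speedup when only part of a task runs in parallel (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even with 95% of the work parallelizable, 64 cores deliver well under 64x.
print(round(amdahl_speedup(0.95, 64), 1))    # 15.4
print(round(amdahl_speedup(0.95, 1024), 1))  # 19.6 -- more cores barely help
```

The serial 5% dominates: the speedup can never exceed 1/0.05 = 20x, however many cores the chip has.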
What about quantum computers? They can offer vast increases in computing power, at least in theory. Each additional qubit doubles the size of the state space the machine can work with, so in principle they could leave Moore in the dust. So far they're mostly a lab curiosity, and their architecture is so different from conventional machines that it isn't clear how they'd handle ordinary computing tasks.
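That doubling-per-qubit claim is just exponential growth in the number of basis states, which is easy to see numerically:

```python
# An n-qubit register spans 2**n basis states in superposition,
# so each extra qubit doubles the size of the state space.
def state_space(qubits):
    return 2 ** qubits

for n in (1, 10, 50):
    print(n, state_space(n))
# 50 qubits already index about 10**15 states.
```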
The short and long term
The rate at which computing power grows may slow over the next few years as chip makers work toward the next breakthrough. They're pouring money into research, since whoever achieves a big jump in speed stands to reap large profits.
As an average over the long term, Moore's law may still hold up. Rather than seeing a steady doubling, we might see a period of reduced growth followed by a big jump.
If we're heading into a few years of slower growth, developers will be under pressure to get more performance out of the available power. Moore's Law has encouraged wasteful coding: developers assume that if their code runs a little slow on the current generation of machines, the next one will take care of the problem. The result is that computers never seem any faster to the user. As the Red Queen said, it takes all the running you can do just to stay in the same place.
Maybe it's time for this to change. Developers will have to concentrate on writing tighter, better optimized code. The race for speed will shift to software.
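As one small, hypothetical illustration of what "tighter code" means in practice: choosing the right data structure is often the cheapest win. In Python, checking membership in a list scans every element, while a set lookup is constant time on average; the result is identical, the cost is not.

```python
import timeit

items = list(range(100_000))
as_list = items          # membership test scans the list: O(n) per lookup
as_set = set(items)      # membership test hashes the key: O(1) average

# Time 100 lookups of an element near the end of the collection.
slow = timeit.timeit(lambda: 99_999 in as_list, number=100)
fast = timeit.timeit(lambda: 99_999 in as_set, number=100)
print(f"list: {slow:.4f}s  set: {fast:.6f}s")
```

No new hardware required; the same machine just stops doing a hundred thousand comparisons where one hash lookup would do.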
If you want to be one of the developers who'll push software performance to new levels, there's no better time to start learning than now. Contact DaVinci Coders to learn about the courses we offer.