Will the hand-wringing over Moore's Law never stop? Intel's announcement that its next-generation 14nm process will be delayed three months triggered yet another round of fretting over the fate of this widely misunderstood "law".
Much of the panic is because Intel's "tick-tock" strategy has indeed operated like clockwork, chiming a smaller geometry every two years. Slippage is common at other companies, but not at Intel. So when the world's largest semiconductor vendor stops the clock for three months, hearts begin palpitating.
In a 1965 paper, Intel co-founder Gordon E. Moore predicted that the number of transistors on an integrated circuit would double approximately every two years. This prediction has proven to be uncannily accurate over the years and has come to be known as Moore’s Law. But it’s not going to hold true forever, is it? Well, it’s believed that like all things good, Moore’s Law too will come to an end one day. The question that remains, though, is when. Noted theoretical (and often theatrical) physicist Michio Kaku feels he has the answer.
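To put that doubling cadence in perspective, here's a quick back-of-the-envelope projection (a Python sketch; the 1971 starting point of roughly 2,300 transistors for the Intel 4004 is only there to make the numbers concrete):

```python
# Back-of-the-envelope Moore's Law projection: transistor counts doubling
# every two years. The 1971 starting point (~2,300 transistors, the Intel
# 4004) is illustrative shorthand, not a claim about any particular chip.

def projected_transistors(year, base_year=1971, base_count=2300, period=2.0):
    """Transistor count implied by doubling every `period` years."""
    doublings = (year - base_year) / period
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Run it out four decades and the projection lands in the billions of transistors, which is roughly where flagship chips actually sit today.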
It’s very likely that today’s lab curiosities represent possibilities that will redesign our world. Here are a few things to watch out for. (We’ll check back in a few years and see if my crystal ball needs recalibrating.)
Despite arguments over the technicalities of Moore's Law, the bottom line is we've seen fairly consistent performance increases throughout the years in the microprocessor industry. The problem with this, says Bill Gates, is that the same expectations can't be applied to other tech sectors.
"We've all been spoiled and deeply confused by the IT model," Gates said in response to a question from the audience during last week's Techonomy conference. "Exponential improvement -- that is rare."
That isn't to say that certain tech segments never see that kind of growth, and according to Gates, you can "see it in hard disk storage, fiber capacity, gene-sequencing rates, biological databases, [and] improvements in modeling software," to name a few. But in other areas, like battery development, exponential growth just isn't a reality.
"They [batteries] haven't improved hardly at all," Gates said. "There are deep physical limits. I am funding five battery start-ups and there are probably 50 out there. [But] that is a very tough problem. It may not be solvable in any sort of economic way."
Ready for a bold prediction? By the year 2011, processors will have broken the 10GHz barrier. So says Intel, but there's a caveat: The Santa Clara chip maker made this prediction nearly 10 years ago, long before the company ditched its Netburst architecture like a crazy ex-girlfriend (we've all been there, right?).
At the time, Intel also said it was working on a system bus that would run faster than its then-upcoming 400MHz (effective) Pentium 4 bus. Boy, how times have changed. At this point, it seems far more likely that we'll see 10-core processors (or more) before 10GHz is ever realized, which underscores just how difficult it is to make reliable predictions in the tech industry.
"If 10GHz is the best that Intel can do by 2011, AMD or somebody else is going to eat their lunch," a Geek.com reader commented in response to Intel's 10GHz prediction. "Intel better pick up the pace if they want to remain dominate."
Ten years ago, Intel probably wasn't worried about GPUs encroaching on CPUs, but if you ask Nvidia, graphics chips are the future of computing. Not only that, but Bill Dally, Nvidia's chief scientist and senior vice president of research, recently wrote an obituary for Moore's Law, saying that "the CPU scaling predicted by Moore's Law is now dead."
Do you think we'll ever see 10GHz processors? If so, when? Will CPUs be eclipsed by GPUs during the next decade? Hit the jump and post your tech predictions!
Some 45 years ago, Gordon Moore, the co-founder of Intel, published a paper predicting that the number of transistors on an integrated circuit would double each year, a forecast he later revised to roughly every two years. The prediction became known as Moore's Law and, though it strictly concerns transistor counts, it has also been used to predict the doubling of CPU performance every 18 months or so. According to Nvidia, that scaling is no more.
"Moore's paper also contained another prediction that has received far less attention over the years," Bill Daily, chief scientist and senior vice president of research at Nvidia wrote in Forbes. "He projected that the amount of energy consumed by each unit of computing would decrease as the number of transistors increased. This enabled computing performance to scale up while the electrical power consumed remained constant. This power scaling, in addition to transistor scaling, is needed to scale CPU performance.
But in development that's been largely overlooked, this power scaling has ended. And as a result, the CPU scaling predicted by Moore's Law is now dead."
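What Dally is describing is often called Dennard scaling: as transistors shrink, their capacitance and operating voltage shrink too, so a chip's total power stays roughly flat even while the transistor count climbs. Here's a rough sketch of that arithmetic, using purely illustrative numbers rather than figures from any real chip:

```python
# Rough sketch of the power scaling Dally describes (often called Dennard
# scaling). Dynamic power is roughly transistors * C * V^2 * f. All numbers
# below are illustrative, not measurements of any real processor.

def chip_power(transistors, cap_per_transistor, voltage, freq_hz):
    return transistors * cap_per_transistor * voltage ** 2 * freq_hz

baseline = chip_power(1e9, 1e-16, 1.0, 1e9)      # starting generation
dennard  = chip_power(2e9, 0.7e-16, 0.7, 1e9)    # 2x transistors, C and V shrink too
post     = chip_power(2e9, 0.7e-16, 1.0, 1e9)    # 2x transistors, V stops shrinking

print(f"baseline: {baseline:.0f} W, with power scaling: {dennard:.0f} W, "
      f"without: {post:.0f} W")
```

Once voltage stops scaling, packing in twice the transistors (or pushing the clock higher) starts pushing power up instead of holding it flat, which is the wall Dally is pointing at.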
Dally went on to say that CPU performance no longer doubles every 18 months, claiming this "poses a grave threat" to the industries that have traditionally relied on the growth in processor performance. According to Dally, a fundamental change in our approach to computing is needed, and multi-core processing is not the answer. Not surprisingly, he sees GPUs as the key.
"Parallel computers, such as graphics processing units, or GPUs, enable continued scaling of computing performance in today's energy-constrained environment," Daily argues. "Every three years we can increase the number of transistors (and cores) by a factor of four."
Read all of what Dally had to say here, and then hit the jump and tell us if you think the future of computing lies in the GPU.
Moore’s Law states that approximately every two years, the number of transistors that can be placed on an integrated circuit doubles. This has held roughly true for more than four decades. But there will come a day when physics puts a stop to that: eventually the boundaries of the atomic scale will limit transistor density. However, a new breakthrough in the field of quantum computing may provide hope for future advances. Until now, a quantum computing device had to be designed for one, and only one, operation. But scientists from the National Institute of Standards and Technology (NIST) have constructed the first programmable quantum processor.
Quantum processing units are fundamentally different in a number of ways. First, where a regular bit can only be 1 or 0, a quantum bit (or qubit) assumes a definite value of 1 or 0 only when it is observed. Additionally, quantum computers aren’t bound by Boolean operators like ‘and’, ‘or’ and ‘not’. Finally, two qubits can be “entangled,” meaning that when they are observed their values are correlated (in the simplest case, they always come out the same), even if the qubits are far apart.
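As a toy illustration of those three points, here's a minimal Python sketch, assuming ideal qubits measured in the same basis; it's not how a real quantum processor is programmed, just a way to picture the behavior:

```python
# Toy sketch of the ideas above (not how a real quantum processor works).
# A qubit in equal superposition only yields a definite 0 or 1 when measured;
# two qubits in the Bell state (|00> + |11>)/sqrt(2) always agree when measured.
import random

def measure_superposed_qubit():
    # |+> = (|0> + |1>)/sqrt(2): 50/50 chance of reading 0 or 1.
    return random.choice([0, 1])

def measure_bell_pair():
    # Ideal entangled pair: the two readings are perfectly correlated.
    shared = random.choice([0, 1])
    return shared, shared

print("single qubit, measured 8 times:", [measure_superposed_qubit() for _ in range(8)])
print("entangled pair, measured once: ", measure_bell_pair())
```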
The NIST computer consists of two quantum gates: a single-qubit gate and an entangled two-qubit gate. The gates used two beryllium ions, manipulated with UV laser pulses, to carry out the operations. The test programs that were run returned correct results 79% of the time. Certainly not perfect, but a huge step forward. You won’t be dropping one of these into a socket on your motherboard anytime soon, but maybe someday.
Intel, working in conjunction with Numonyx, unveiled a breakthrough technology that could help keep Moore’s Law on track. The new process will enable non-volatile memory to cost-effectively scale down to 5nm.
Without getting too technical, the companies were able to build upon phase-change memory (PCM) and create a new technology called “phase-change memory and switch” (PCMS). PCMS integrates a new thin-film selector that effectively lets the memory/selector layers stack very densely. The nature of PCM allows it to operate in two ways: quick, RAM-like bit changing, and non-volatile storage.
It is unlikely we’ll see devices using the technology for “many years,” according to Al Fazio, Intel Fellow and director of memory technology development. However, this is a key first step in continuing to scale technology according to Moore’s Law.
Intel co-founder Gordon Moore once predicted that the number of transistors on an integrated circuit would double every 18 to 24 months, a prediction which has been famously dubbed Moore's Law. But according to market research firm iSuppli, the move to 18nm will signal the end of Moore's Law.
"The usable limit for semiconductor process technology will be reached when chip process geometries shrink to be smaller than 20nm, to 18nm nodes," said Len Jelinek, director and chief analyst, semiconductor manufacturing, for iSuppli. "At those nodes, the industry will start getting to the point where semiconductor manufacturing tools are too expensive to depreciate with volume production, i.e., their costs will be so high, that the value of their lifetime productivity can never justify it."
So when exactly will it happen? According to iSuppli, in the year 2014. In 2007, Gordon Moore said his prediction could be upheld for at least another decade. Five years from now, one of them is going to be wrong.
When it comes to Moore’s law these days, it seems like everyone’s a cynic. However, now there’s one more reason to be optimistic about the future of miniaturization, as researchers have published a paper describing a lithography technique which may provide a new means of producing chip features smaller than 32nm.
The technique involves the use of quasiparticles called plasmons to focus light at an incredibly high resolution. Chris Lee at Ars Technica describes the technology: “A lens, based on plasmons, can be created by a set of concentric metal rings. The fields from the plasmons in each ring act in such a way as to create a tightly focused spot of light. In principle, these lenses could focus light tightly enough to create features about five to ten nanometers in size.”
The problem with plasmon lenses is that they must be positioned just 20nm away from the wafer. The scientists claim to have overcome this hurdle with their new technique, which uses air pressure to control the lens’s distance from the wafer.
Significantly, the new technique eliminates the need to create a new photomask for each revision to the chip, potentially lowering costs and speeding up development.