In a 1965 paper, Intel co-founder Gordon E. Moore predicted that the number of transistors on an integrated circuit would double approximately every two years. This prediction has proven to be uncannily accurate over the years and has come to be known as Moore’s Law. But it’s not going to hold true forever, is it? Well, it’s believed that like all things good, Moore’s Law too will come to an end one day. The question that remains, though, is when. Noted theoretical (and often theatrical) physicist Michio Kaku feels he has the answer.
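For the arithmetic-minded, the law itself is easy to play with. Here's a minimal Python sketch of the doubling projection; the starting figures (the Intel 4004's roughly 2,300 transistors in 1971) are just illustrative, not from the article:

```python
# Moore's Law: transistor counts double roughly every two years.
# Project forward from an illustrative baseline, the Intel 4004
# (1971, ~2,300 transistors).
def projected_transistors(start_count, start_year, year, doubling_years=2):
    """Transistor count projected from a baseline by Moore's Law doubling."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# Forty years is twenty doublings: a factor of 2**20, about a million.
count_2011 = projected_transistors(2300, 1971, 2011)
print(f"{count_2011:,.0f}")  # on the order of a few billion transistors
```

Twenty doublings from a few thousand transistors lands in the low billions, which is roughly where desktop chips actually were four decades on; hence "uncannily accurate."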
As far as quantum computing breakthroughs go, this latest one by a team of researchers from the U.S., Australia and South Africa is truly special. According to the researchers, a tiny crystal of just 300 atoms that they developed has paved the way for a “huge leap” in computing. A leap so vast, they claim, that it would take a supercomputer larger than the known universe to do the kind of calculations possible with their “quantum simulator,” a special type of quantum computer. Hit the jump for more.
Powerful quantum computing and instantaneous long-distance quantum communication (à la the Normandy's quantum entanglement communicator in Mass Effect) sound all well and good, but in reality, that sort of technology will never blossom unless we figure out how to create working quantum networks first. Oh wait! We have. Yesterday, scientists from the Max Planck Institute of Quantum Optics in Germany announced that they've created the first quantum link between two atoms physically located far apart from one another.
We've heard you snickering in the corner. Quantum computing is definitely a solid theory; scientists have been able to make a couple of electrons dance to the same proverbial tune for a while now. But what use is that? Critics say that quantum theory is mostly a mind exercise and will never be able to scale up for useful applications. Well, MIT quantum scientist Scott Aaronson is sick of hearing that crap, and he's putting his money where his mouth is in the form of a $100,000 prize for anyone able to demonstrate that "scalable quantum computing is impossible in the physical world."
You know that game that mimes play, where they mimic your every action, pretending to be a mirror? Well, if we're ever going to get down and dirty with some true quantum computing, scientists are probably going to have to teach photons to pull mime impressions en masse. A complex process called quantum entanglement makes it so that any changes that happen to one particle happen to the others as well; harnessing that power is the theoretical key to quantum computing. Now, researchers from the University of Bristol have created the world's first fully programmable photon-entangling silica chip, which could be a major step towards true quantum processing.
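The mirror-mime picture can be made concrete with a toy simulation. Here's a minimal, hypothetical Python sketch (nothing to do with the Bristol chip's actual operation) that samples measurements of a two-qubit Bell state, the textbook entangled state where the two particles always agree:

```python
import random

# Two-qubit Bell state (|00> + |11>) / sqrt(2), stored as a map from
# basis state to amplitude.
amp = {"00": 2 ** -0.5, "01": 0.0, "10": 0.0, "11": 2 ** -0.5}

def measure(state, rng):
    """Sample a basis state with probability |amplitude|^2."""
    outcomes = list(state)
    weights = [abs(state[s]) ** 2 for s in outcomes]
    return rng.choices(outcomes, weights=weights)[0]

rng = random.Random(42)
samples = [measure(amp, rng) for _ in range(1000)]

# Each qubit alone looks like a fair coin flip, yet the two results
# always match: every sample is "00" or "11", never "01" or "10".
assert all(s in ("00", "11") for s in samples)
```

That perfect correlation between individually random outcomes is what "changes to one particle happen to the others" really means, and it's the resource the Bristol chip is built to generate on demand.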
Some of the biggest breakthroughs in future tech revolve around some of the smallest materials on Earth. Even calling these technologies "micro" overstates their actual tiny sizes by orders of magnitude. From the nano-scaled heat transfer of Nanowick Cooling down to the single-atom level of Graphene and Quantum Computing, our white papers will help you wrap your head around the maximum potential of these minuscule technologies.
For the Knights of the Round Table, the holy grail is, well, the holy grail. The holy grail for computer geeks is a little different, but perhaps just as legendary – quantum computing. While super-performing PCs powered by quantum bits sound good in theory, achieving results in the real world is a lot harder than just talking about it. Fortunately, that doesn't stop scientists. A team of researchers at the National Institute of Standards and Technology has managed to entangle two ions using a small microwave device, which could be a key step in the quest for quantum computing.
We've already covered a new ThinkGeek gadget today, so let's keep the "Think Geek" ball rolling and talk about a concept that keeps real-life geeks awake at night, jittering at the thought of its awesomeness: quantum computing. Even though Lockheed Martin signed up to buy the D-Wave One, an underperforming "maybe it's a quantum computer," a few months back, the face-melting power we think of when uttering the words "quantum computer" is still a long way off. A pair of researchers at Purdue University just inched it a little bit closer to reality, however.
Researchers across the pond have stumbled on a major breakthrough that could expedite humanity's march toward quantum computing. Scientists from Bristol's Centre for Quantum Photonics have developed a photonic chip capable of performing “calculations that are exponentially more complex than before,” thanks to a complex photonic maneuver called a quantum walk. Hitherto, all quantum walk experiments had been performed with a single photon, but these British dudes are the first to pull it off with two identical particles of light.
"It is widely believed that a quantum computer will not become a reality for at least another 25 years. However, we believe, using our new technique, a quantum computer could, in less than ten years, be performing calculations that are outside the capabilities of conventional computers," said Professor Jeremy O'Brien, Director of the Centre for Quantum Photonics.
"Using a two-photon system, we can perform calculations that are exponentially more complex than before," says Prof O'Brien. "This is very much the beginning of a new field in quantum information science and will pave the way to quantum computers that will help us understand the most complex scientific problems."
Some would consider quantum computing the holy grail of computer technology, and while there's often talk of quantum breakthroughs, nothing ever seems to materialize. Hoping to change that, Canadian firm D-Wave Systems has developed what it claims are working 16-qubit, 28-qubit, and 128-qubit quantum computer chips.
Some of the chips were fabricated at the microdevices lab of NASA's Jet Propulsion Laboratory in Pasadena, and though there's no shortage of skepticism in the research community, scientists who have seen the work for themselves back D-Wave's credibility.
So does Google, which said in a blog post last week that it had been working with D-Wave to develop quantum computers to power a search of still images in a database of images, video, and PDFs. According to Google, the three-year project has started to pay off.
"Today, at the Neural Information Processing Systems conference (NIPS 2009), we show the progress we have made," Google wrote. "We demonstrate a detector that has learned to spot cars by looking at example pictures. It was trained with adiabatic quantum optimization using a D-Wave C4 Chimera chip."
Is it the holy grail? Not quite. Google went on to say that there are still many open questions. What's encouraging to Google, however, is that based on the company's experiments, D-Wave's detector platform performed better than those Google had trained using classical solvers running on PCs in the search giant's data centers.