Powerful quantum computing and instantaneous long-distance quantum communication (à la the Normandy's quantum entanglement communicator in Mass Effect) sound all well and good, but in reality, that sort of technology will never blossom unless we figure out how to create working quantum networks first. Oh wait! We have. Yesterday, scientists from the Max Planck Institute of Quantum Optics in Germany announced that they've created the first quantum link between two physically distant atoms.
We've heard you snickering in the corner. Quantum computing is definitely a solid theory; scientists have been able to make a couple of electrons dance to the same proverbial tune for a while now. But what use is that? Critics say that quantum computing is mostly a mind exercise and will never scale up for useful applications. Well, one MIT quantum scientist, Scott Aaronson, is sick of hearing that crap, and he's putting his money where his mouth is in the form of a $100,000 prize to anyone able to demonstrate that "scalable quantum computing is impossible in the physical world."
Think your USB 3.0 or Thunderbolt port delivers blazing fast transfer rates? You must not be a high-energy physicist. While the rest of the world was patiently waiting for Intel to drag Thunderbolt ports from Macs to PCs, a group of the aforementioned scientists and network engineers decided to get a little more proactive and develop a technology that transfers two-way data at a rate of 186 friggin' gigabits per second – a new world record that makes the 10 Gbps offered by Thunderbolt look absolutely sluggish.
For the Knights of the Round Table, the holy grail is, well, the holy grail. The holy grail for computer geeks is a little different, but perhaps just as legendary – quantum computing. While super-performing PCs powered by quantum bits sound good in theory, achieving results in the real world is a lot harder than just talking about it. Fortunately, that doesn't stop scientists. A team of researchers at the National Institute of Standards and Technology has managed to entangle two ions using a small microwave device, which could be a key step in the quest for quantum computing.
We've already covered a new ThinkGeek gadget today, so let's keep the "Think Geek" ball rolling and talk about a concept that keeps real-life geeks awake at night, jittering at the thought of its awesomeness: quantum computing. Even though Lockheed Martin signed up to buy an underperforming "maybe it's a quantum computer," the D-Wave One, a few months back, the face-melting power we think of when uttering the words "quantum computer" is still a long way off. A pair of researchers at Purdue University just inched it a little bit closer to reality, however.
One surefire way to egg on the hacking community is to place ever-increasing restrictions on your product, essentially daring black hat coders to find a back door. Nvidia is finding this out the hard way, after the GPU maker modified its latest PhysX drivers to prevent any non-Nvidia GPU from working, says news and rumor site The Inquirer.
And if that weren't enough, the latest version of PhysX also prevents physics processing unit (PPU) cards from working if it detects a non-Nvidia card in the system. That may have been the proverbial straw that broke the hacking community's back, and a hacker who goes by the handle GenL has put together some experimental code that stops Nvidia's drivers from shutting everything down when it detects a Radeon card.
We haven't tried it ourselves, but if you're feeling adventurous, rebellious, or both, you can grab the code here.
Several upcoming titles have announced support for Nvidia's hardware PhysX, which could be good news for the GPU maker. However, up until this point, games supporting PhysX have been a mixed bag, perhaps leading to a sense of apathy among gamers. Or at least that's what AnandTech's newest poll seems to suggest.
When asked how important hardware PhysX acceleration is in buying software, 52 percent of the nearly 9,000 respondents said it was only "Marginal; PhysX is a bonus if a game I like supports it." Thirty-one percent took it a step further, calling PhysX "Not useful," and 3 percent said it was "Detrimental." Only 13 percent found PhysX "Useful," "Important," or "Very Important."
Things weren't much better (for Nvidia) when the same question was asked about making a hardware buying decision. A slightly lower 79 percent of respondents found PhysX to be anywhere from a marginal to detrimental marketing bullet point. And the responses weren't overly swayed by ATI video card owners, either. According to current poll results, 52 percent of respondents own an Nvidia card with support for PhysX.
Earlier this year, Maximum PC Editor-in-Chief Will Smith challenged Nvidia "to stop trying to convince us that closed APIs are good, and instead embrace OpenCL." Fast forward to today and the graphics chip maker still isn't ready to kill CUDA, but it did become the first to release an OpenCL driver and Software Development Kit (SDK) in pre-beta form. Nvidia says its goal is to solicit early feedback in anticipation of a beta release to be made available in coming months.
"The OpenCL standard was developed on Nvidia GPUs and Nvidia was the first company to demonstrate OpenCL code running on a GPU," said Tony Tamasi, senior VP of technology and content at Nvidia. "Being the first to release an OpenCL driver to developers cements Nvidia's leadership in GPU Computing and is another key milestone in our ongoing strategy to make the GPU the soul of the modern PC."
If you haven't been following along at home, OpenCL is short for Open Computing Language and is an open programming framework paving the way for developers to tap into the power of GPUs for general-purpose computing, otherwise known as GPGPU (general-purpose computing on GPUs). The open standard has the potential to work on most modern GPUs, and not just Nvidia hardware like the company's CUDA platform. But don't read this as Nvidia giving up on CUDA. On the contrary, Nvidia feels OpenCL reinforces the ideas behind CUDA, and has bumped up the CUDA release schedule to include three releases planned for 2009.
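For the curious, the core idea behind GPGPU frameworks like OpenCL and CUDA is data parallelism: you write a small "kernel" that processes one element, and the framework runs it across thousands of GPU threads at once. Here's a minimal plain-Python sketch of that model (the kernel and function names here are invented for illustration, not actual OpenCL API calls):

```python
# Illustrative sketch of the data-parallel "kernel" model used by
# GPGPU frameworks such as OpenCL and CUDA. In a real framework the
# kernel body runs once per element across many GPU threads; here we
# simply map it over the input range in plain Python.

def saxpy_kernel(a, x_i, y_i):
    """One 'work item': computes a * x + y for a single element."""
    return a * x_i + y_i

def launch(kernel, a, xs, ys):
    """Stand-in for a framework's kernel launch over the whole range."""
    return [kernel(a, x, y) for x, y in zip(xs, ys)]

result = launch(saxpy_kernel, 2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
print(result)  # [12.0, 24.0, 36.0]
```

The win comes from the fact that each work item is independent, so the hardware is free to run as many of them simultaneously as it has cores.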
At this month’s GDC, AMD and Havok teamed up to show off the latest advances in their development of OpenCL, an open programming framework that will allow physics processing to shift between the CPU and GPU on the fly.
The concept behind OpenCL is simple: it’s a system that allows the load from physics processing to shift between different pieces of hardware on the fly. For example, if a gamer has a high-end GPU but a slower processor, OpenCL can detect this and move the bulk (if not all) of the physics processing to the GPU, alleviating some of the stress on the CPU. And the system works in reverse for slower GPUs paired with high-end CPUs.
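The scheduling decision described above can be sketched as a simple heuristic. This is purely illustrative (the scoring function, names, and threshold are made up, not anything from AMD or the Khronos spec); a real runtime would query the actual hardware:

```python
# Hypothetical sketch of the load-shifting decision described above:
# route the physics workload to whichever device (CPU or GPU) has the
# most spare capability. Scores and the 2x threshold are invented for
# illustration only.

def pick_physics_device(cpu_score, gpu_score):
    """Return 'gpu' when the GPU clearly outclasses the CPU, 'cpu'
    for the reverse case, and 'split' when they're roughly balanced."""
    if gpu_score >= 2 * cpu_score:
        return "gpu"
    if cpu_score >= 2 * gpu_score:
        return "cpu"
    return "split"

print(pick_physics_device(cpu_score=10, gpu_score=40))  # fast GPU, slow CPU -> gpu
print(pick_physics_device(cpu_score=50, gpu_score=20))  # fast CPU, slow GPU -> cpu
print(pick_physics_device(cpu_score=30, gpu_score=35))  # balanced -> split
```

The point of the demo was exactly this kind of flexibility: the developer writes the physics code once, and the runtime decides where it executes.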
What’s even better is that OpenCL will work across all platforms. While PhysX currently only works with Nvidia GPUs, OpenCL will work with AMD and Intel processors, as well as Nvidia and ATI GPUs. So, no more concerns about compatibility!
Sadly, the demo on display at GDC ran on only a single piece of hardware; the on-the-fly switch between CPU and GPU wasn’t shown. AMD was clear to state that the demo was only a proof of concept, and that the development process is still ongoing.
Open-source software and hardware are common elements of the technological world. And now the ancient counterparts to these modern products, written books, have finally jumped the gap into open waters as well. Virginia's currently accepting public comments for its first-ever open-source school textbook, "21st Century Physics FlexBook: A Compilation of Contemporary and Emerging Technologies."
Licensed under Creative Commons Attribution Share Alike, the project ran from concept to creation in a little under four months – an impressively quick turnaround for a full textbook. Thirteen teachers from Virginia's K-12 physics community joined up with university and industry volunteers across a number of states to develop the book's eleven chapters. And each chapter was given no fewer than three peer reviews, from college professors, related authors, and high school and college students.