Earlier this year, Maximum PC Editor-in-Chief Will Smith challenged Nvidia "to stop trying to convince us that closed APIs are good, and instead embrace OpenCL." Fast forward to today and the graphics chip maker still isn't ready to kill CUDA, but it did become the first to release an OpenCL driver and Software Development Kit (SDK) in pre-beta form. Nvidia says its goal is to solicit early feedback ahead of a beta release planned for the coming months.
"The OpenCL standard was developed on Nvidia GPUs and Nvidia was the first company to demonstrate OpenCL code running on a GPU," said Tony Tamasi, senior VP of technology and content at Nvidia. "Being the first to release an OpenCL driver to developers cements Nvidia's leadership in GPU Computing and is another key milestone in our ongoing strategy to make the GPU the soul of the modern PC."
If you haven't been following along at home, OpenCL is short for Open Computing Language, an open programming framework that paves the way for developers to tap the power of GPUs for general-purpose computing, otherwise known as GPGPU (general-purpose computing on GPUs). Because it's an open standard, OpenCL has the potential to work on most modern GPUs, rather than being limited to Nvidia hardware the way the company's CUDA platform is. But don't read this as Nvidia giving up on CUDA. On the contrary, Nvidia feels OpenCL reinforces the ideas behind CUDA, and it has bumped up the CUDA release schedule to include three releases planned for 2009.
After three years of service, ex-Google Visual Design Lead Douglas Bowman parted ways with the search giant last Friday, while also offering some parting thoughts about the company and his decision to move on. His reason for leaving? Not enough creative freedom.
In a blog post, Bowman laments the process of how Google implemented design decisions, saying the company relied too much on data and not enough on subjectivity. He says the reliance on hard numbers ultimately became a crutch that prevented Google from making any daring design decisions.
"Yes, it's true that a team at Google couldn't decide between two blues, so they're testing 41 shades between each blue to see which one performs better," Bowman wrote on his blog. "I had a recent debate over whether a border should be 3, 4, or 5 pixels wide, and was asked to prove my case. I can't operate in an environment like that. I've grown tired of debating such minuscule design decisions."
Despite his design philosophy criticism, Bowman says he understands where Google is coming from with billions of shareholder dollars at stake and millions of users around the world to try and please. He also says he has something else lined up, which he'll announce at a later date.
Unlike the chicken and the egg, in today's multicore environment, we can definitively say the hardware came first, and we're beginning to wonder if the software will ever come at all. We're not referring to the handful of games and applications that are multicore friendly, but the widespread development of software to take advantage of multiple cores.
So what's the holdup? According to participants at last week's Multicore Expo in Santa Clara, California, programming challenges remain. While there's no shortage of multicore processors in the wild, much of the software being written is still geared toward single-core computing.
"Looking at the specifications for these software products, it is clear that many will be challenged to support the hardware configurations possible today and those that will be accelerating in the future," said Carl Claunch, vice president and distinguished analyst at Gartner. "The impact is akin to putting a Ferrari engine in a go-cart; the power may be there, but design mismatches severely limit the ability to exploit it."
The above statement comes from a report Gartner released two months ago. In it, Claunch goes on to explain that the software running today's servers has both hard and soft limits on the number of processors it can effectively handle, the latter of which requires trial and error to overcome.
Parallel computing may seem like a no-brainer, but programmers point to the potential of new types of software bugs and lack of programming tools. On the bright side, more tools are emerging, and both Intel and AMD have made it clear that the future of computing lies in multiple cores. That future will be realized once software development catches up to the hardware.
According to a survey conducted by the Computing Research Association, the number of majors and pre-majors in American computer science programs was up 6.2 percent from 2007. This marks the first time in six years that enrollment in computer science has increased.
"This could be a sign that we are beginning to make headway as well as increased attention, increased interest, and increased investment," said Andrew A. Chien, director of research at Intel.
Since the dot-com implosion starting in 2000, the field has seen a startling decline, leading some to warn about the effect it would have on the nation's ability to compete in the global economy. But in the past few years, there has been much effort to allay potential students' fears that computer science entails little more than sitting cooped up in front of a PC banging out code. That has helped lead to a 9.5 percent increase in the number of new undergraduate majors in computer science, and cut the decline in new bachelor's degrees from 20 percent to 10 percent.
Despite the increase, computer science remains of most interest to men, at least according to enrollment and graduation figures. Women accounted for just 11.8 percent of computer science bachelor's degrees in 2008.
Everyone has big plans for the cloud these days, including Mozilla, which on Thursday launched an open-source project called Bespin. The basic idea behind Bespin is to offer a web-based programming framework that brings together the speed of desktop-based development with cloud computing. While in very early form, Mozilla has set some high-level goals for the project:
Ease of Use - the editor experience should not be intimidating and should facilitate quickly getting straight into the code.
Real-time Collaboration - sharing live coding sessions with colleagues should be easy and collaboratively coding with one or more partners should Just Work.
Integrated Command-Line - tools like vi and Emacs have demonstrated the power of integrating command-lines into editors. Bespin needs one, too.
Extensible and Self-Hosted - the interface and capabilities of Bespin should be highly extensible and easily accessible to users through Ubiquity-like commands or via the plug-in API.
Wicked Fast - the editor is just a toy unless it stays smooth and responsive while editing very large files.
Accessible from Anywhere - the code editor should work from anywhere, and from any device, using any modern standards-compliant browser.
As it stands now, Bespin 0.1 is just an initial prototype framework with support for basic editing features like syntax highlighting, undo/redo, previewing files in the browser, and other low-level tasks. In the long-run, Mozilla hopes to "empower web developers to hack on the editor itself and make it their own."
Developers who want to give the early prototype a whirl can access the Bespin demo here.