Nvidia’s secret war with Intel has evolved into a full-scale arms race for the atomic bomb of graphics technology: ray tracing. Using its forum at SIGGRAPH, Nvidia demonstrated an interactive ray tracing simulation running on four of the company's next-generation Quadro GPUs, housed in a Quadro Plex 2100 D4 Visual Computing System with an estimated street price of around $11,000. Not exactly your standard gaming rig, but it gets the point across. Either way, it appears as though Nvidia is finally taking a cue from Intel and is focusing at least some of its effort on developing hardware capable of making this technique a reality for everyday users.

The demonstration featured linear scaling of an anti-aliased Bugatti Veyron model with over two million polygons. It ran at a resolution of 1920x1080 (1080p) and chugged along at an impressive 30 FPS. The demonstration also featured image-based lighting paint shaders, reflections/refractions, and ray traced shadows. Industry insiders noted that the demo was an impressive undertaking, since it was one of the first interactive ray tracing demonstrations done on a GPU. Intel has demonstrated ray tracing using Quake 3, but that was done using CPU power.

Larrabee will be Intel’s counter in the consumer market, but it remains to be seen whether its CPU-style design will be as capable of pushing out polygons as Nvidia’s offerings. Gamers are no doubt hoping the new race to master ray tracing will accelerate its development, but I have a feeling we will be playing Duke Nukem Forever long before we see consumer-grade ray tracing solutions from either company. Though the important first steps are now well underway.
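For the curious, the core of any ray tracer is firing a ray per pixel and testing what it hits. Here's a minimal sketch of the classic ray-sphere intersection test, solving the quadratic for the hit distance; this is just the textbook technique, and says nothing about how Nvidia's Quadro demo is actually implemented:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None."""
    # Vector from the ray origin to the sphere center
    oc = tuple(o - c for o, c in zip(origin, center))
    # Coefficients of the quadratic |origin + t*direction - center|^2 = r^2
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# A ray fired straight down the z-axis at a unit sphere 5 units away
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # hits at t = 4.0
```

Repeat that test against millions of polygons, for millions of pixels, plus secondary rays for shadows and reflections, and it becomes clear why pushing this to 30 FPS at 1080p takes four workstation GPUs.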
If not for the fact that I was able to actually make physical contact with David Hayter at this year's Capcom E3 press conference, it would've been a total letdown. The whole thing was just a giant shill for Capcom's Lost Planet film, and its reception was nearly as icy cold as the movie/game's setting. But in between cracking big, corporate grins and repeatedly uttering the Japanese equivalent of "So awesome," the Capcom big-wigs dropped a tiny bomb. See, as it turns out, Lost Planet had popped from Capcom's collective womb with a ticket to Hollywood in hand. The game was born to be a film.
As we've seen with movies like Doom and Resident Evil, and games like Guitar Hero, media convergence is inevitable. United we stand; divided, we make less money. And that just won't do. However, whereas other instances of convergence have taken two (or more) disparate media forms and none-too-subtly mashed them together -- casualties be damned -- Lost Planet, if all goes according to plan, will straddle the line between games and film. Instead of removing what makes the game special -- effectively neutering it with a rusty knife -- Lost Planet: The Movie has the potential to usher in an era of game-themed movies not unlike what we're seeing with comic books right now.
But is that what we want? Last I checked, comic book fans were a tiny niche, nearly fit for a somber, "Don't let these beautiful creatures die" commercial from the World Wildlife Fund. Yeah, I'm not sure comics are the greatest role model. Plus, do we really want cherished characters having their in-game appearances altered just so they can more aptly fit their roles as movie characters (see Nick Fury, among others)?
So, are you ready for some top-notch game-to-movie conversions, or would you rather our hobby stick to the small screen, interactive and proud?
Today's Roundup features a big-name title that's already being preened for stardom, and wouldn't you know it, Electronic Arts is the, er, preener. Inside, you'll also find Rockstar decrying the hardcore/casual divide, a top-15 list of Olympic proportions, and massive success from a WoW competitor. Hurdle past the break for more.
Wondering what's going on inside the mystery that we call "Windows 7?" You could do worse than dropping in from time to time on the brand-new Engineering Windows 7 blog hosted at MSDN.
E7, as its co-authors, senior engineering leaders on the Windows 7 product Jon DeVaan and Steven Sinofsky, call it, is aimed at the "...audience of enthusiasts, bloggers, and those that are the most passionate about Windows..."
We strongly believe that success for Windows 7 includes an open and honest, and two-way, discussion about how we balance all of these interests and deliver software on the scale of Windows. We promise and will deliver such a dialog with this blog.
Starting from the first days of developing Windows 7, we have committed as a team to “promise and deliver”. That’s our goal—share with you what we’re going to get done, why we’re doing it, and deliver it with high quality and on time.
Can they deliver - not just on the expectations we have for Windows 7, but on the promise to keep us in the loop during the run-up to RTM? Find out what others have to say about that, and get your chance to speak up after the jump.
The whole world went gaga over the PS3’s Cell processor at the advent of the 7th generation of consoles. That hype slowly subsided as the PS3 failed to set the cash registers ringing. However, an imminent deluge of Cell-based products - Toshiba's latest Qosmio notebooks bear a Cell-derived chip - has turned the spotlight back on to the Cell Broadband Engine.
SIGGRAPH 2008 brings together roughly 30,000 computer graphics and interactive technology professionals from around the world, with a focus on science, research, art, animation, gaming, and education. Prizes are handed out for the best computer animation among the submissions. This year’s conference ends today.
"The caliber of submissions this year was truly phenomenal, which made the jury's job especially difficult," said Jill Smolin, SIGGRAPH 2008 Conference Entertainment Director. "The winners truly showcase what is possible today and provide a glimpse into what artists can achieve in the future. Really, the only limitation is the imagination."
The SIGGRAPH 2008 Best of Show award goes to Oktapodi by Gobelins, l'école de l’image, from France.
Best Student Piece Winner award goes to 893 by Supinfocom from France.
The Jury Award Winner is Mauvais Rôle by École Supérieure de Réalisation Audiovisuelle, also from France.
It seems the French made a clean sweep. The conference was held in Los Angeles. Go figure.
Oktapodi and Mauvais Rôle are both pretty entertaining, although I found Mauvais Rôle funnier. It must have been the gamer references. I couldn’t find a link to 893. Check them out!
Coming this fall, Sony will unveil its first WHDI device, the DMX-WL1T. If you haven't been following, WHDI is a new technology co-developed by Amimon, Hitachi, Motorola, Samsung, Sharp, and Sony that provides a high-quality, uncompressed wireless link for transmitting video data rates of up to 3Gbps between an HD source and an HDTV.
Giving the device widespread flexibility, Sony's DMX-WL1T will come equipped with four HDMI inputs, one component input, one digital audio input, and a stereo analog input. The two-piece system will transmit uncompressed 1080i video and audio, but according to Sony Insider, HD content will likely only stream to a Sony DMex compatible Bravia HDTV.
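A quick back-of-the-envelope calculation shows why a 3Gbps link is enough for uncompressed 1080i. The figures below assume 24 bits per pixel and ignore blanking intervals and audio, so treat them as illustrative only:

```python
# Why ~3 Gbps suffices for uncompressed 1080i (illustrative assumptions:
# 8 bits per channel x 3 channels = 24 bpp; blanking and audio ignored).
width, height = 1920, 1080
fields_per_second = 60          # 1080i delivers 60 interlaced fields/s
bits_per_pixel = 24
# Each interlaced field carries half of the frame's lines
bits_per_second = width * (height // 2) * fields_per_second * bits_per_pixel
print(f"{bits_per_second / 1e9:.2f} Gbps")  # ~1.49 Gbps, comfortably under 3 Gbps
```

With roughly half the pipe left over, there's headroom for audio, signaling, and the overhead a real wireless link inevitably adds.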
Concrete details have yet to emerge, but it looks as though the WHDI device will offer a range of up to 100 feet and possibly more. Three IR Blaster ports also suggest that users will be able to control other third-party devices. Sony is expected to officially announce the DMX-WL1T later this month at the IFA conference in Berlin. Until then, it's all speculation, including pricing and availability.
There are all kinds of gadgets and gizmos for the visually impaired, and thanks to designer Chueh Lee at Samsung China, those who can't see might soon be able to take pictures. The Touch Sight camera doesn't come with an LCD, instead displaying snapshots as a three-dimensional image by embossing the surface of a built-in Braille display.
"Touch Sight is a revolutionary digital camera designed for visually impaired people," said Lee. "Simple features make it easy to use, including a unique feature which records sound for three seconds after pressing the shutter button. The user can then use the sound as a reference when reviewing and managing the photos."
Visually impaired photographers are advised to hold the camera up against their forehead, similar to having a third eye, as the best way to stabilize and aim the camera. Once the pictures are snapped, the touchable photos are saved to the camera and the ones worth sharing can be uploaded for other Touch Sight camera owners to download and feel.
Kudos to Lee for one of the grooviest gadgets we've seen in recent times.
Forget any talk of shortages or competitive pressure from VIA, Intel's Atom processors are thriving amid the recent Netbook and Mobile Internet Device (MID) movement. "Atom is off to a very, very rapid start, far exceeding our expectations when we started the year," CFO Stacy Smith said in an interview Tuesday. "It's the perfect recession product to have in the marketplace."
The success of its Atom processor has helped Intel achieve a 25 percent rise in quarterly profit despite a weak global economy, with Smith maintaining an overall revenue forecast in the third quarter between $10.0 and $10.6 billion.
Yields are good too. According to Smith, Intel gets about 2,500 Atom processors per silicon wafer, and while that's not quite as good as on a Core or Xeon chip, it's enough to ensure strong profitability on Atom CPUs. Still, Intel remains cautiously optimistic.
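To see where a figure like 2,500 comes from, here's the classic gross-die estimate: wafer area divided by die area, minus an edge-loss correction. The ~25 mm² die size is an assumption on my part (a commonly cited rough figure for the original Atom), not something Intel has confirmed here:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Classic gross-die estimate: wafer area / die area, minus edge loss."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# 300 mm wafer, ~25 mm^2 die (an assumed rough figure for the original Atom)
print(dies_per_wafer(300, 25))  # ~2,694 gross dies
```

Knock off a fraction for defects and edge exclusion and you land right around the 2,500 good dies Smith quotes, which is exactly why a tiny die makes a cheap chip profitable.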
"We'll know kind of in six months how much of this demand (for Atom) is real and how much is customers thinking they're going to win in the market place and double-ordering," Smith said. "It seems to be growing the market rather than cannibalizing existing PC sales."
Will Intel's Atom chips continue to exceed expectations now that Centrino 2 platforms are starting to trickle out?
It might not be well publicized, but there's a major war brewing between Microsoft and Adobe, and they're fighting for you. Each one of them wants to be your provider for rich media content, a task that has traditionally been served by Adobe with its Flash player, but one Olympic sized loss could change the game in Microsoft's favor.
It was Microsoft who won the deal to supply NBC with video-viewing technology via Silverlight for the Olympics in Beijing, and while Microsoft and NBC have ties that go back to their collaboration building MSNBC, Adobe could have been considered a favorite to win the account based on the maturity of its Flash technology. So how did Microsoft secure the gold?
"We talked about features like adaptive streaming, the ability to automatically keep checking how much bandwidth you have and deliver the appropriate quality stream and how to be smart about knowing what's coming up in the stream," said Rob Bennett, the general manager of sports for MSN.
In other words, Microsoft won the account on a combination of Silverlight's feature set and by convincing NBC that Flash's scalability had never been put to an Olympic-sized test, whereas Silverlight's underlying technology is built on proven Windows Media technologies.
Of course, it's only one account, but it's not so much what Adobe lost, but what Microsoft gained. While download specifics have not been disclosed, we do know that it's registering 1.5 million downloads a day, and according to a spokeswoman for Microsoft, "in the last several days, more than 50 percent of the visitors to NBCOlympics.com on MSN already have Silverlight 2 installed."
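The adaptive streaming Bennett describes boils down to a simple loop: measure available bandwidth, then serve the highest-quality stream that fits. Here's a minimal sketch of that idea; the bitrate ladder and 20% headroom factor are hypothetical values of my own, not Silverlight's actual algorithm:

```python
def pick_bitrate(measured_kbps, ladder=(3500, 2200, 1400, 800, 350)):
    """Pick the highest bitrate rung the measured bandwidth can sustain,
    keeping ~20% headroom; fall back to the lowest rung otherwise."""
    budget = measured_kbps * 0.8        # leave headroom for fluctuations
    for rung in ladder:                 # ladder is sorted high -> low
        if rung <= budget:
            return rung
    return ladder[-1]                   # below every rung: take the floor

print(pick_bitrate(3000))  # 2200: 3000 kbps * 0.8 = 2400 kbps budget
print(pick_bitrate(200))   # 350: below every rung, serve the floor quality
```

Re-running a check like this every few seconds is what lets a player step quality up and down smoothly instead of stalling to rebuffer.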
We are consuming huge amounts of bandwidth daily. Just 10 years ago I would have been thrilled with a 4Mb down 512Kb up connection. Today that’s just so-so when it comes to broadband. Downloading video, music, or whatever, is consuming massive amounts of bandwidth and communications companies are working hard to keep up. It’s only going to get more crowded on our current system.
Fiber optics is the big thing for moving large amounts of data around. After all, there isn’t anything faster than light (without getting into quantum physics…). The internet’s current speed woes come from routing information to its various destinations, not from transporting it.
Fiber optics still relies on regular routers to relay information to its correct destination. While fiber optics can handle frequencies in the terahertz range, electronics work in the gigahertz range. Those pulses of light have to be converted into electrical signals, which are stored, routed, and turned back into optical signals with lasers to be transmitted on. Besides adding significant cost and complexity, the conversion slows down data transmission.
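A toy model makes the cost of that optical-electrical-optical round trip concrete: every router hop adds a fixed conversion and processing delay on top of the light's travel time. All the numbers below are illustrative assumptions, not measurements of any real network:

```python
# Toy latency model: propagation delay plus a per-hop OEO conversion cost.
# Both constants are illustrative assumptions for the sake of the sketch.
SPEED_IN_FIBER_KM_PER_MS = 200      # light covers roughly 200 km/ms in glass
OEO_DELAY_MS = 0.05                 # assumed per-hop conversion/processing cost

def path_latency_ms(distance_km, hops, oeo_delay_ms=OEO_DELAY_MS):
    propagation = distance_km / SPEED_IN_FIBER_KM_PER_MS
    return propagation + hops * oeo_delay_ms

# A 4,000 km path: 20 ms of pure light travel...
print(path_latency_ms(4000, hops=0))    # all-optical ideal
print(path_latency_ms(4000, hops=20))   # ...plus the tax from 20 OEO hops
```

The per-hop tax looks small in isolation, but it compounds across every router on the path, which is why eliminating the conversion step entirely is so attractive.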
So the simple thing to do is to slow light down and remove the needed conversion process. I can hear Han Solo now, “Slow down light speed? Not on this ship brother.”
That is just what researchers are trying to do using "metamaterials". If they can slow down light during the switching process, there would be no need for the electrical conversion step. It could be a first step toward building a light-based computer.
You can catch the whole article on the BBC website here.