Perhaps in the not-too-distant future, you'll be able to coax a few more minutes of talk time from your smartphone by jogging around the block. That's because researchers at the Georgia Institute of Technology have come up with a way of generating enough energy to power portable devices by walking or running.
What they've done is develop tiny nanowires made of zinc oxide. These piezoelectric wires generate an electric charge when flexed or strained, converting mechanical energy into electrical energy.
"Any physical action that bends the substrate creates energy," said Zhong Lin Wang, professor and director of the Center for Nanostructure Characterization at Georgia Institute of Technology. Wang went on to explain that the electricity output depends on the number of nanowires and how strong the materials are.
There haven't been any field tests yet, but within two to three years, the researchers think they'll have substrates small and stable enough to integrate into low-power devices like Bluetooth transmitters. And what about smartphones and other similarly sized portable devices? We're looking at five years down the road, Wang said.
We fully grasp that, as a human race, we're intelligent enough to devise a way to use cow pies to power a data center, but what we really want to know is how anyone in HP's ranks kept a straight face while discussing the "manure output of cows." Not only did HP talk about this internally, but the company went and drew up an entire game plan, and who knows, that smell seeping in through your car windows as you drive down the interstate might one day be coming from a new data center.
"The idea of using animal waste to generate energy has been around for centuries, with manure being used every day in remote villages to generate heat for cooking. The new idea that we are presenting in this research is to create a symbiotic relationship between farms and the IT ecosystem that can benefit the farm, the data center and the environment," said Tom Christian, principal research scientist, Sustainable IT Ecosystem Lab, HP.
According to HP, 10,000 dairy cows produce enough waste to power a 1 megawatt (MW) data center, which is the equivalent of a medium-sized data center, and still have power left over to support other needs on the farm. And get this - the heat generated by the data center can be used to increase the efficiency of the anaerobic digestion of animal waste. Yep, warm manure is just what the IT industry needs.
On a related note, did you know that the average dairy cow produces about 120 pounds of manure each day, or about 20 metric tons per year? That's enough to generate about 3 kilowatt-hours (kWh) of electrical energy per day.
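For the curious, HP's numbers hang together. Here's a quick back-of-the-envelope check in C; note that reading the 1 MW and 3 kWh figures as continuous/daily rates is our interpretation, not something HP spells out:

```c
/* HP's cow-power math, checked. The "per day" readings of the 3 kWh
 * and 1 MW figures are our interpretation of HP's numbers. */
#include <stdio.h>

int main(void) {
    const double cows            = 10000.0;  /* herd size HP cites           */
    const double kwh_per_cow_day = 3.0;      /* energy from one cow's manure */
    const double dc_load_mw      = 1.0;      /* medium-sized data center     */

    double herd_kwh_day = cows * kwh_per_cow_day;         /* 30,000 kWh/day */
    double herd_avg_mw  = herd_kwh_day / 24.0 / 1000.0;   /* ~1.25 MW       */
    double dc_kwh_day   = dc_load_mw * 1000.0 * 24.0;     /* 24,000 kWh/day */

    printf("Herd output:    %.0f kWh/day (%.2f MW average)\n",
           herd_kwh_day, herd_avg_mw);
    printf("Data center:    %.0f kWh/day\n", dc_kwh_day);
    printf("Left for farm:  %.0f kWh/day\n", herd_kwh_day - dc_kwh_day);
    return 0;
}
```

Run it and the herd averages roughly 1.25 MW, leaving about 6,000 kWh a day for the farm - right in line with HP's "power left over" claim.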
We're not sure whether to laugh, cry, or sign up for an Old Glory insurance plan, but apparently it's now possible to be legally wed by a robot, at least in Tokyo.
The couple in question are both connected with Japan's robotics industry, and now they're legally connected to each other, all thanks to an automated preacher known as I-Fairy.
According to BBC News, I-Fairy, with its flashing eyes and plastic pigtails, instructed the groom to lift the bride's veil for the customary kiss after the two tied the knot. C-3PO was nowhere to be found, nor was Robby, a Cylon, or any of the other notable robots.
It was on May 16, 1960 that Hughes Research Laboratories researcher Theodore Maiman built the world's first laser, using a flash lamp to stimulate a pink ruby rod. Now, 50 years later, lasers have become a prominent fixture in everyday life. Here's a quick look at some of the laser's hits and misses.
Back before Val Kilmer largely fell off the acting radar, he starred in a movie called Real Genius as a laid-back college student working on a chemical laser. In the final scene, Kilmer and fellow students fill their corrupt professor's house with a huge tin of popcorn kernels and pop them with a laser, flooding the entire home. Sure, Mythbusters went on to debunk the possibility of something like this happening in real life, but the original scene is no less awesome, or memorable. Watch the clip here.
Anyone remember Kenwood's 72X TrueX optical drive? We do, and we named it one of the top 10 tech blunders because of its high failure rate, disappointing speeds, and poor media compatibility. The technology, which sent a normal red laser diode's beam through a diffraction grating to split it into seven beams, seemed promising at the time, but it just couldn't live up to the hype.
In late 2008, Microsoft took blue lasers mainstream with its SideWinder X8 mouse. Our verdict? A disappointing 5. Read the review here.
So we weren't overly impressed with Microsoft's X8, but we absolutely loved (and still do) Logitech's G9x, the only rodent we've ever awarded a perfect 10 verdict. The G9x replaces the original G9's 3,200dpi laser sensor with a 5,000dpi laser sensor, resulting in ultra-sensitive movement when you need it. Read the review here.
So where do lasers go from here? Only time will tell, but here's to another 50 years of awesome innovation (and wacky Hollywood portrayals).
Kenneth Crocker is in the books as the first person ever to undergo remote heart surgery, performed at Glenfield Hospital in Leicestershire to correct an irregular heartbeat, the UK's Daily Mail reports.
A 3-foot robotic arm guided a thin surgical tube into the 70-year-old patient's body while the surgeon sat in a separate room, controlling the delicate procedure with a remote control and steering the tube through a vein into the heart. By performing the procedure remotely, the surgeon was shielded from dangerous levels of radiation from the more than 250 X-rays needed to monitor the probe's location over a procedure lasting up to eight hours.
"I've been very excited about the operation for weeks," Crocker explained. "It's a little bit of extra magic being the first in the world. I tried cardioversion, which is electric shock therapy, and different medicines to get rid of the problem but so far nothing has worked. I've seen the robotic arm and it's an impressive piece of kit. I'd like to shake hands with it after when I'm cured but maybe that won’t be possible."
Given the success of the surgery, Dr. Ng, the surgeon who performed the operation, said the technique could be used to treat up to 50,000 Britons diagnosed with an irregular heartbeat each year, potentially reducing strokes and heart failure.
GE claims to have developed an LED light bulb that distributes light like an incandescent bulb but doesn't need to be changed for 17 years (at four hours of use per day). The bulb sips just 9 watts, a 77 percent energy savings, while providing about the same light output as a 40W incandescent, GE says.
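Those two headline numbers square with each other, as a quick check shows (the roughly 25,000-hour rated life is our inference from "17 years at four hours a day," not a figure GE states):

```c
/* GE's "17 years" and "77 percent" claims, checked against each other.
 * The rated-life figure is our inference from GE's numbers, not a
 * spec GE states in the announcement. */
#include <stdio.h>

int main(void) {
    const double led_watts          = 9.0;   /* stated draw           */
    const double incandescent_watts = 40.0;  /* bulb it replaces      */
    const double hours_per_day      = 4.0;   /* GE's usage assumption */
    const double years              = 17.0;  /* claimed service life  */

    double lifetime_hours = years * 365.0 * hours_per_day;   /* ~24,800 */
    double savings_pct    = 100.0 * (incandescent_watts - led_watts)
                                  / incandescent_watts;      /* 77.5%   */

    printf("Implied rated life: %.0f hours\n", lifetime_hours);
    printf("Energy savings vs. a 40W bulb: %.1f%%\n", savings_pct);
    return 0;
}
```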
"This is a bulb that can virtually light your kid's bedroom desk lamp from birth through high school graduation," says John Strainic, global product general manager, GE Lighting. "It's an incredible advancement that's emblematic of the imagination and innovation that GE's applying to solve some of the world's biggest challenges."
The LED bulb sports a funky aesthetic, and there's good reason for that. According to GE, the fins around the sides help direct light downward onto the intended surface and all around, rather than beaming light out the top of a lampshade like most current LED bulbs do.
Look for GE's LED bulb to ship this fall or in early 2011 for around $40 to $50.
For the past couple of years, a team of HP scientists has sat tucked away in a laboratory with the sole goal of pushing memristor research forward. What exactly is a memristor? Put simply, it's an electrical resistor with memory properties, and according to HP, memristors could push the speed of flash-based media tenfold or higher.
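If the "resistor with memory" idea sounds abstract, here's a minimal C sketch of the linear ion-drift model HP researchers published in 2008 to describe the device: the resistance at any moment depends on how much charge has already flowed through. The parameter values are ballpark illustrations we've chosen, not the specs of an actual HP part:

```c
/* A minimal sketch of the linear ion-drift memristor model.
 * Resistance depends on the charge that has flowed through the
 * device, so the device "remembers". All parameter values are
 * ballpark illustrations. Build with: cc memristor.c -lm */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double PI    = 3.14159265358979323846;
    const double R_ON  = 100.0;     /* fully doped resistance, ohms   */
    const double R_OFF = 16000.0;   /* fully undoped resistance, ohms */
    const double D     = 10e-9;     /* film thickness, meters         */
    const double mu_v  = 1e-14;     /* dopant mobility, m^2/(V*s)     */
    const double dt    = 1e-6;      /* integration step, seconds      */

    double x = 0.1;  /* normalized width of the doped (conductive) region */

    for (long step = 0; step < 2000000; ++step) {
        double t = step * dt;
        double v = sin(2.0 * PI * t);               /* 1 Hz, 1 V drive        */
        double M = R_ON * x + R_OFF * (1.0 - x);    /* memristance            */
        double i = v / M;
        x += mu_v * R_ON / (D * D) * i * dt;        /* charge moves the state */
        if (x < 0.0) x = 0.0;                       /* clamp to physical range */
        if (x > 1.0) x = 1.0;
        if (step % 200000 == 0)
            printf("t=%.1fs  v=%+.2f V  M=%6.0f ohms\n", t, v, M);
    }
    return 0;
}
```

Sweep the drive voltage and the printed memristance rises and falls with the history of the current; cut the current and the state simply stays put, which is why HP pitches memristors for storage.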
"This is sort of the missing element of the processor puzzle. It takes its place alongside the resistor, capacitor and inductor [as the fourth basic circuit element in chip engineering]. And it could change the way we do IT," said Stan Williams, HP senior fellow and Director of quantum Science Research.
Williams made those comments during the Flash Memory Summit in August 2009, and now less than a year later, Williams said they have discovered that the memristor "has more capabilities than we previously thought." No longer do Williams and Co. think memristors will just apply to storage devices, but they say "the memristor can perform logic, enabling computation to one day be performed in chips where data is stored, rather than on a specialized central processing unit."
If they're right, this could end up extending Moore's Law even after it's no longer possible to shrink transistors, Williams said.
With companies like Asus and Gigabyte all gung-ho to push USB 3.0 into every household, you'd expect the new spec would have an easy time marching into the mainstream. Unfortunately, that isn't the case, and you're not likely to see USB 3.0 become widespread until the tail end of 2011. The reason? No direct support from Intel.
"The real sweet spot of a new version of USB comes when it is integrated into the chipset of the PC," said Brian O'Rourke, an analyst at In-Stat. "That's when USB becomes mainstream. By integrating it into the its chipsets, Intel essentially allows PC OEMs to offer that new flavor of USB for free.
But according to O'Rourke, Intel isn't expected to do this until late 2011. Whether or not that's really the timeline, Intel won't say, but at least one analyst, Nathan Brookwood, believes that USB 3.0 just isn't a priority for Intel.
"USB 2.0 is doing a pretty good job for most people," according to Brookwood. And what about HD camcorders and HD digital cameras, which can benefit from the extra transfer speed that USB 3.0 offers? "Those people are typically willing to pay a premium for high-end systems that have USB 3.0."
According to Cisco, employees are crying out for more collaboration tools, even though many feel constrained by corporate policies.
Be that as it may, a recent global study by Cisco suggests that 77 percent of IT decision makers plan to increase spending on collaboration tools this year. Left to their own devices, more than a quarter of those surveyed who work at organizations that prohibit the use of social media applications admitted altering the settings on their corporate gadgets in order to gain access, saying they "need the tools to get the job done."
Of those who said they expect spending on collaboration tools to increase, 56 percent said such spending would likely rise by at least 10 percent, if not more. India and China seem to be the most progressive in adopting the technology, Cisco said, though the majority of IT decision makers recognize the importance of collaboration tools, specifically the need for better video conferencing equipment, Web conferencing, and Internet Protocol telephony.
It's here, ladies and gentlemen - the Khronos Group today announced the release of the OpenGL 4.0 specification at GDC 2010 in San Francisco.
In short, the latest iteration "brings the very latest in cross-platform graphics acceleration and functionality" to PCs and workstations, but if you're looking for a bullet list of geeky details, we have you covered. Some of the benefits include (see the code sketch after the list):
two new shader stages that enable the GPU to offload geometry tessellation from the CPU;
per-sample fragment shaders and programmable fragment shader input positions for increased rendering quality and anti-aliasing flexibility;
drawing of data generated by OpenGL, or external APIs such as OpenCL, without CPU intervention;
shader subroutines for significantly increased programming flexibility;
separation of texture state and texture data through the addition of a new object type called sampler objects;
64-bit double precision floating point shader operations and inputs/outputs for increased rendering accuracy and quality;
performance improvements, including instanced geometry shaders, instanced arrays, and a new timer query.
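To make that first bullet concrete, here's a minimal, hypothetical C sketch of how an app drives the new tessellation path. It assumes an OpenGL 4.0 context with entry points already loaded (via GLEW here); prog and vao are placeholders for a shader program built with the two new stages and a vertex array of patch control points:

```c
/* A minimal, hypothetical sketch of the new GL 4.0 tessellation path.
 * Assumes a 4.0 context with entry points loaded (GLEW here); prog
 * stands in for a program that links the two new stages (tessellation
 * control + evaluation), vao for a buffer of patch control points. */
#include <GL/glew.h>

void draw_tessellated_patch(GLuint prog, GLuint vao)
{
    glUseProgram(prog);
    glBindVertexArray(vao);

    /* Tell GL how many control points form one patch, then submit the
       new GL_PATCHES primitive; the GPU's tessellation stages subdivide
       the geometry that the CPU used to have to generate itself. */
    glPatchParameteri(GL_PATCH_VERTICES, 4);
    glDrawArrays(GL_PATCHES, 0, 4);
}
```

The win is in that last call: instead of tessellating a mesh on the CPU and shipping millions of triangles over the bus, the app sends a handful of control points and lets the GPU do the subdividing.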
"The release of OpenGL 4.0 is a major step forward in bringing state-of-the-art functionality to cross-platform graphics acceleration, and strengthens OpenGL’s leadership position as the epicenter of 3D graphics on the web, on mobile devices as well as on the desktop," said Barthold Lichtenbelt, OpenGL ARB working group chair and senior manager Core OpenGL at NVIDIA. “NVIDIA is pleased to announce that its upcoming Fermi-based graphics accelerators will fully support OpenGL 4.0 at launch."
So what does this all mean for Joe Gamer? That remains to be seen, and will ultimately be decided by developers. OpenGL 4.0 has DirectX 11 in its sights, and Khronos has no qualms about saying so. "OpenGL 4.0 exposes the same level of capability of GPUs as DirectX 11," the group said during a presentation at GDC.