It was on May 16, 1960, that Hughes Research Laboratories researcher Theodore Maiman built the world's first laser, using a flash lamp to stimulate a pink ruby rod. Now, 50 years later, lasers have become a prominent fixture of everyday life. Here's a quick look at some of the laser's hits and misses.
Back before Val Kilmer largely fell off the acting radar, he starred in a movie called Real Genius as a laid-back college student working on a chemical laser. In the final scene, Kilmer and his fellow students pack their corrupt professor's house with a huge tin of popcorn kernels and pop them with a laser, filling the entire home. Sure, Mythbusters went on to debunk the possibility of something like this happening in real life, but the original scene is no less awesome, or memorable. Watch the clip here.
Anyone remember Kenwood's 72X TrueX optical drive? We do, and we named it one of the top 10 tech blunders because of its high failure rate, disappointing speeds, and poor media compatibility. The technology seemed promising at the time: it sent the beam from a normal red laser diode through a diffraction grating, splitting it into seven parts, but it just couldn't live up to the hype.
In late 2008, Microsoft took blue lasers mainstream with its SideWinder X8 mouse. Our verdict? A disappointing 5. Read the review here.
So we weren't overly impressed with Microsoft's X8, but we absolutely loved (and still do) Logitech's G9x, the only rodent we've ever awarded a perfect 10 verdict. The G9x replaces the original G9's 3,200dpi laser sensor with a 5,000dpi laser sensor, delivering ultra-sensitive movement when you need it. Read the review here.
So where do lasers go from here? Only time will tell, but here's to another 50 years of awesome innovation (and wacky Hollywood portrayals).
Kenneth Crocker is in the books as the first person ever to undergo remote heart surgery, which was performed at Glenfield Hospital in Leicestershire to correct an irregular heartbeat, the UK's Daily Mail reports.
A 3-foot robotic arm guided a thin surgical tube into the 70-year-old patient's body while the surgeon sat in a separate room, steering the tube through a vein into the heart with a remote control. Performing the procedure remotely isolated the surgeon from dangerous levels of radiation from the more than 250 X-rays used to monitor the probe's location during a procedure that could last up to eight hours.
"I've been very excited about the operation for weeks," Crocker explained. "It's a little bit of extra magic being the first in the world. I tried cardioversion, which is electric shock therapy, and different medicines to get rid of the problem but so far nothing has worked. I've seen the robotic arm and it's an impressive piece of kit. I'd like to shake hands with it after when I'm cured but maybe that won’t be possible."
Given the success of the surgery, Dr. Ng, the surgeon who performed the operation, said the technique could be used to treat up to 50,000 Britons diagnosed with an irregular heartbeat each year, potentially reducing strokes and heart failure.
GE claims to have developed an LED light bulb that distributes light like an incandescent bulb but doesn't need to be changed for 17 years (based on four hours of use per day). The bulb sips just 9 watts and provides a 77 percent energy savings, all while putting out about as much light as a 40W incandescent, GE says.
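GE's numbers are internally consistent, and a quick back-of-the-envelope check (assuming a 365-day year, which is our assumption, not GE's stated methodology) confirms them:

```python
# Sanity-check GE's claims using the figures from the article.
watts_led, watts_incandescent = 9, 40
hours_per_day, years = 4, 17

# Energy savings vs. a 40W incandescent at roughly the same light output.
savings = (watts_incandescent - watts_led) / watts_incandescent
print(f"Energy savings: {savings:.1%}")  # 77.5%, matching GE's "77 percent"

# Rated life implied by "17 years at 4 hours per day".
lifetime_hours = years * 365 * hours_per_day
print(f"Rated life: {lifetime_hours:,} hours")  # 24,820 hours
```

That works out to roughly 25,000 hours of rated life, a common spec for LED lamps of this era.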
"This is a bulb that can virtually light your kid's bedroom desk lamp from birth through high school graduation," says John Strainic, global product general manager, GE Lighting. "It's an incredible advancement that's emblematic of the imagination and innovation that GE's applying to solve some of the world's biggest challenges."
The LED bulb sports a funky aesthetic, and there's good reason for that. According to GE, the fins around the sides help direct light downward onto the intended surface and all around, rather than beaming light out the top of a lampshade as most current LED bulbs do.
Look for GE's LED bulb to ship this fall or in early 2011 for around $40 to $50.
For the past couple of years, a team of HP scientists has sat tucked away in a laboratory with the sole goal of pushing memristor research forward. What exactly is a memristor? Put simply, it's an electrical resistor with memory properties, and according to HP, memristors could boost the speed of flash-based media tenfold or more.
"This is sort of the missing element of the processor puzzle. It takes its place alongside the resistor, capacitor and inductor [as the fourth basic circuit element in chip engineering]. And it could change the way we do IT," said Stan Williams, HP senior fellow and Director of quantum Science Research.
Williams made those comments at the Flash Memory Summit in August 2009, and now, less than a year later, Williams says his team has discovered that the memristor "has more capabilities than we previously thought." Williams and company no longer think memristors apply only to storage devices; they say "the memristor can perform logic, enabling computation to one day be performed in chips where data is stored, rather than on a specialized central processing unit."
If they're right, this could end up extending Moore's Law even after it's no longer possible to shrink transistors, Williams said.
With companies like Asus and Gigabyte all gung-ho to push USB 3.0 into every household, you'd expect the new spec would have an easy time marching into the mainstream. Unfortunately, that isn't the case, and you're not likely to see USB 3.0 become widespread until the tail end of 2011. The reason? No direct support from Intel.
"The real sweet spot of a new version of USB comes when it is integrated into the chipset of the PC," said Brian O'Rourke, an analyst at In-Stat. "That's when USB becomes mainstream. By integrating it into the its chipsets, Intel essentially allows PC OEMs to offer that new flavor of USB for free.
But according to O'Rourke, Intel isn't expected to do this until late 2011. Whether or not that's really the timeline, Intel won't say, but at least one analyst believes that USB 3.0 just isn't a priority for Intel.
"USB 2.0 is doing a pretty good job for most people," according to Brookwood. And what about HD camcorders and HD digital cameras, which can benefit from the extra transfer speed that USB 3.0 offers? "Those people are typically willing to pay a premium for high-end systems that have USB 3.0."
According to Cisco, IT departments are crying out for more collaboration tools, even as many employees feel constrained by corporate policies.
Indeed, a recent global study by Cisco suggests that 77 percent of IT decision makers plan to increase spending on collaboration tools this year. Meanwhile, left to their own devices, more than a quarter of surveyed workers at organizations that prohibit the use of social media applications admitted to altering the settings on their corporate gadgets to gain access, saying they "need the tools to get the job done."
Of those who expect spending on collaboration tools to increase, 56 percent said it would likely rise by at least 10 percent, and probably more. India and China appear to be the most progressive in adopting the technology, Cisco said, though the majority of IT decision makers recognize the importance of collaboration tools, specifically the need for better video conferencing equipment, Web conferencing, and Internet Protocol telephony.
It's here, ladies and gentlemen: the Khronos Group today announced the release of the OpenGL 4.0 specification at GDC 2010 in San Francisco.
In short, the latest iteration "brings the very latest in cross-platform graphics acceleration and functionality" to PCs and workstations, but if you're looking for a bullet list of geeky details, we have you covered. Some of the benefits include:
two new shader stages that enable the GPU to offload geometry tessellation from the CPU;
per-sample fragment shaders and programmable fragment shader input positions for increased rendering quality and anti-aliasing flexibility;
drawing of data generated by OpenGL, or external APIs such as OpenCL, without CPU intervention;
shader subroutines for significantly increased programming flexibility;
separation of texture state and texture data through the addition of a new object type called sampler objects;
64-bit double precision floating point shader operations and inputs/outputs for increased rendering accuracy and quality;
performance improvements, including instanced geometry shaders, instanced arrays, and a new timer query.
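The double-precision item on that list is easy to appreciate outside a shader. The sketch below emulates 32-bit float arithmetic with Python's struct module (Python's native floats are 64-bit) and shows how a long running sum, the kind a lengthy shader loop or transform chain can build up, drifts badly at single precision:

```python
import struct

def to_f32(x):
    """Round a Python float (64-bit) to IEEE-754 single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Accumulate 0.1 one hundred thousand times, once with every operation
# rounded to 32 bits and once at Python's native 64-bit precision.
acc32 = 0.0
acc64 = 0.0
for _ in range(100_000):
    acc32 = to_f32(acc32 + to_f32(0.1))  # rounded to single precision each step
    acc64 += 0.1                         # double precision throughout

print(abs(acc32 - 10_000.0))  # single precision: error on the order of 1
print(abs(acc64 - 10_000.0))  # double precision: error on the order of 1e-8
```

A few parts in ten thousand is enough to produce visible artifacts in rendering contexts like depth comparisons or large-world coordinates, which is why native 64-bit shader math matters for "rendering accuracy and quality."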
"The release of OpenGL 4.0 is a major step forward in bringing state-of-the-art functionality to cross-platform graphics acceleration, and strengthens OpenGL’s leadership position as the epicenter of 3D graphics on the web, on mobile devices as well as on the desktop," said Barthold Lichtenbelt, OpenGL ARB working group chair and senior manager Core OpenGL at NVIDIA. “NVIDIA is pleased to announce that its upcoming Fermi-based graphics accelerators will fully support OpenGL 4.0 at launch."
So what does this all mean for Joe Gamer? That remains to be seen, and ultimately decided by developers. OpenGL 4.0 has DirectX 11 in its sights, and Khronos has no qualms about saying so. "OpenGL 4.0 exposes the same level of capability of GPUs as DirectX 11," the company said during a presentation at GDC.
What's the over/under on how long the Large Hadron Collider (LHC) stays running? We don't know the answer, but we'd be inclined to take the 'under' bet every time. In the latest bit of bad news, the atom-smashing machine will have to be shut down at the end of 2011 for up to a year in order to address design issues.
"It's something that, with a lot more resources and with a lot more manpower and quality control, possibly could have been avoided but I have difficulty in thinking that this is something that was a design error," said Dr. Steve Myers, a director of the European Organization for Nuclear Research.
"The standard phrase is that the LHC is its own prototype. We are pushing technologies towards their limits.
"You don't hear about the thousands or hundreds of thousands of other areas that have gone incredibly well. With a machine like LHC, you only build one and you only build it once."
Point well taken, but no less disappointing. Following 14 months of inactivity, the machine was only recently restarted, but issues remain. Joints between the machine's magnets need to be strengthened before higher-energy collisions can take place. In the meantime, the decision has been made to run the LHC for 18 to 24 months at half power before pulling the plug for a year to make the necessary improvements. Bummer.
Despite a lingering recession, Microsoft isn't holding back when it comes to spending. According to Kevin Turner, Microsoft's chief operating officer, the Redmond giant will spend around $9.5 billion on research and development this year, which is about $3 billion more than the next closest tech company.
"Especially in light of the tough difficult macroeconomic times that we're coming out of, we chose to really lean in and double down on our innovation," Turner said.
Much of that investment will go toward the cloud, an area in which Turner sees his company becoming a leader as it tries to "change and reinvent" itself. Turner added that Microsoft will still maintain a significant on-premises software business, even as companies such as Google pursue cloud-only software solutions.
Forget about traditional touchscreen displays, laser keyboards, and gesture-based controls. None of those have the same wacky sci-fi appeal as "Skinput," the new self-touch input method that Carnegie Mellon University and Microsoft are tag-teaming.
Skinput is essentially a touchscreen interface for your flesh, but don't worry: it doesn't require any surgery or limb replacements. Instead, a microchip-sized pico projector beams images onto your skin. When you tap on them, the resulting signals are picked up by a special armband with a bio-acoustic sensing array built in.
"We resolve the location of finger taps in the arm and hand by analyzing mechanical vibrations to propagate through the body," the research team states in their abstract. "We collect these signals using a novel array of sensors worn as an armband.This approach provides an always available, naturally portable, and on-body finger input system."
The armband contains five piezoelectric cantilevers, each weighted to respond to a certain band of sound frequencies. A different combination of sensors is triggered depending on where you tap yourself.
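The classification idea behind that sensor array can be sketched in a few lines. Everything below is hypothetical for illustration: the tap locations, the five-element "band energy" profiles, and the nearest-profile matching are invented stand-ins for the trained classifier the real system uses:

```python
import math

# Hypothetical Skinput-style tap classification: each tap yields a 5-element
# vector of band energies, one per weighted piezo cantilever, and we match it
# to the closest known profile. All values here are invented for illustration;
# the actual system trains a classifier on per-user calibration data.
TAP_PROFILES = {
    "wrist":   [0.9, 0.7, 0.2, 0.1, 0.1],
    "forearm": [0.3, 0.8, 0.9, 0.4, 0.2],
    "palm":    [0.1, 0.2, 0.5, 0.9, 0.8],
}

def classify(reading):
    """Return the tap location whose profile is nearest (Euclidean) to the reading."""
    return min(TAP_PROFILES, key=lambda loc: math.dist(reading, TAP_PROFILES[loc]))

print(classify([0.2, 0.75, 0.85, 0.5, 0.25]))  # -> forearm
```

Because different body locations transmit vibrations to the cantilevers with different frequency signatures, even this crude nearest-match scheme conveys why a handful of weighted sensors is enough to tell a wrist tap from a forearm tap.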