For the past couple of years, a team of HP scientists has sat tucked away in a laboratory with the sole goal of pushing memristor research forward. What exactly is a memristor? Put simply, it's an electrical resistor with memory properties, and according to HP, memristors could boost the speed of flash-based media tenfold or more.
"This is sort of the missing element of the processor puzzle. It takes its place alongside the resistor, capacitor and inductor [as the fourth basic circuit element in chip engineering]. And it could change the way we do IT," said Stan Williams, HP senior fellow and director of Quantum Science Research.
Williams made those comments during the Flash Memory Summit in August 2009, and now, less than a year later, Williams says the team has discovered that the memristor "has more capabilities than we previously thought." No longer do Williams and Co. think memristors apply only to storage devices; they now say "the memristor can perform logic, enabling computation to one day be performed in chips where data is stored, rather than on a specialized central processing unit."
If they're right, this could end up extending Moore's Law even after it's no longer possible to shrink transistors, Williams said.
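The "resistor with memory" behavior HP describes can be sketched with the linear ion-drift memristor model its researchers published; the parameter values below are illustrative assumptions, not figures from an actual HP device.

```python
# Sketch of the linear ion-drift memristor model (HP Labs / Strukov et al.).
# All parameter values here are illustrative assumptions, not device figures.

R_ON = 100.0      # resistance when fully doped, in ohms (assumed)
R_OFF = 16000.0   # resistance when undoped, in ohms (assumed)
D = 10e-9         # device thickness in meters (assumed)
MU_V = 1e-14      # dopant mobility in m^2/(V*s) (assumed)

def memristance(q):
    """M(q) = R_OFF * (1 - MU_V * R_ON / D^2 * q).

    The resistance depends on the total charge q (in coulombs) that has
    flowed through the device -- that history-dependence is the 'memory'.
    """
    return R_OFF * (1.0 - MU_V * R_ON / D ** 2 * q)

# Passing charge through the device lowers its resistance, and the new
# value persists until more charge flows -- unlike an ordinary resistor.
print(memristance(0.0))    # pristine device: 16000.0 ohms
print(memristance(5e-5))   # after 50 microcoulombs: 8000.0 ohms
```

Because the state variable is accumulated charge rather than an applied voltage, the device remembers its resistance even with power removed, which is what makes it interesting both as storage and, per Williams, as a logic element.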
With companies like Asus and Gigabyte gung-ho to push USB 3.0 into every household, you'd expect the new spec would have an easy time marching into the mainstream. Unfortunately, that isn't the case, and you're not likely to see USB 3.0 become widespread until the tail end of 2011. The reason? No direct support from Intel.
"The real sweet spot of a new version of USB comes when it is integrated into the chipset of the PC," said Brian O'Rourke, an analyst at In-Stat. "That's when USB becomes mainstream. By integrating it into its chipsets, Intel essentially allows PC OEMs to offer that new flavor of USB for free."
But according to O'Rourke, Intel isn't expected to do this until late 2011. Whether or not that's really the timeline, Intel won't say, but at least one analyst believes that USB 3.0 simply isn't a priority for Intel.
"USB 2.0 is doing a pretty good job for most people," according to Nathan Brookwood, an analyst at Insight 64. And what about HD camcorders and HD digital cameras, which can benefit from the extra transfer speed that USB 3.0 offers? "Those people are typically willing to pay a premium for high-end systems that have USB 3.0."
According to Cisco, employees are crying out for more collaboration tools, even though many feel constrained by corporate policies.
Be that as it may, a recent global study by Cisco suggests that 77 percent of IT decision makers plan to increase spending on collaboration tools this year. Left to their own devices, more than a quarter of those surveyed who work at organizations that prohibit the use of social media applications admitted altering the settings on their corporate gadgets in order to gain access, saying they "need the tools to get the job done."
Of those who said they expect spending on collaboration tools to increase, 56 percent said such spending would likely rise by 10 percent or more. India and China seem to be the most progressive in adopting the technology, Cisco said, though the majority of IT decision makers recognize the importance of collaboration tools, specifically the need for better video conferencing equipment, Web conferencing, and Internet Protocol telephony.
It's here, ladies and gentlemen: the Khronos Group today announced the release of the OpenGL 4.0 specification at GDC 2010 in San Francisco.
In short, the latest iteration "brings the very latest in cross-platform graphics acceleration and functionality" to PCs and workstations, but if you're looking for a bullet list of geeky details, we have you covered. Some of the benefits include:
two new shader stages that enable the GPU to offload geometry tessellation from the CPU;
per-sample fragment shaders and programmable fragment shader input positions for increased rendering quality and anti-aliasing flexibility;
drawing of data generated by OpenGL, or external APIs such as OpenCL, without CPU intervention;
shader subroutines for significantly increased programming flexibility;
separation of texture state and texture data through the addition of a new object type called sampler objects;
64-bit double precision floating point shader operations and inputs/outputs for increased rendering accuracy and quality;
performance improvements, including instanced geometry shaders, instanced arrays, and a new timer query.
"The release of OpenGL 4.0 is a major step forward in bringing state-of-the-art functionality to cross-platform graphics acceleration, and strengthens OpenGL's leadership position as the epicenter of 3D graphics on the web, on mobile devices as well as on the desktop," said Barthold Lichtenbelt, OpenGL ARB working group chair and senior manager of Core OpenGL at NVIDIA. "NVIDIA is pleased to announce that its upcoming Fermi-based graphics accelerators will fully support OpenGL 4.0 at launch."
So what does this all mean for Joe Gamer? That remains to be seen, and ultimately decided by developers. OpenGL 4.0 has DirectX 11 in its sights, and Khronos has no qualms about saying so. "OpenGL 4.0 exposes the same level of capability of GPUs as DirectX 11," the company said during a presentation at GDC.
What's the over/under on how long the Large Hadron Collider (LHC) stays running? We don't know the answer, but we'd be inclined to take the 'under' bet every time. In the latest bit of bad news, the atom-smashing machine will have to be shut down at the end of 2011 for up to a year in order to address design issues.
"It's something that, with a lot more resources and with a lot more manpower and quality control, possibly could have been avoided, but I have difficulty in thinking that this is something that was a design error," said Dr. Steve Myers, a director at the European Organization for Nuclear Research (CERN).
"The standard phrase is that the LHC is its own prototype. We are pushing technologies towards their limits.
"You don't hear about the thousands or hundreds of thousands of other areas that have gone incredibly well. With a machine like LHC, you only build one and you only build it once."
Point well taken, but no less disappointing. Following 14 months of inactivity, the machine was only recently restarted, but issues remain. Joints between the machine's magnets need to be strengthened before higher-energy collisions can take place. In the meantime, the decision has been made to run the LHC for 18 to 24 months at half power before pulling the plug for a year to make the necessary improvements. Bummer.
Despite a lingering recession, Microsoft isn't holding back when it comes to spending. According to Kevin Turner, Microsoft's chief operating officer, the Redmond giant will spend around $9.5 billion on research and development this year, which is about $3 billion more than the next closest tech company.
"Especially in light of the tough difficult macroeconomic times that we're coming out of, we chose to really lean in and double down on our innovation," Turner said.
Much of that investment will go towards the cloud, an area Turner sees his company becoming a leader in as it tries to "change and reinvent" itself. Turner also added that Microsoft will still maintain a significant on-premise software business, even as companies such as Google look to cloud-only software solutions.
Forget about traditional touchscreen displays, laser keyboards, and gesture-based controls. None of those has the same wacky sci-fi appeal as "Skinput," the new self-touch input method Carnegie Mellon University and Microsoft are tag-teaming.
Skinput is essentially a touchscreen interface for your flesh, but don't worry: it doesn't require any surgery or limb replacements. Instead, a microchip-sized pico projector beams images onto your skin, and when you tap on them, the signals are picked up by a special armband with a bio-acoustic sensing array built in.
"We resolve the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body," the research team states in its abstract. "We collect these signals using a novel array of sensors worn as an armband. This approach provides an always available, naturally portable, and on-body finger input system."
The armband contains five piezoelectric cantilevers, each one weighted to respond to certain bands of sound frequencies. A different combination of sensors is triggered depending on where you tap yourself.
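That mapping from sensor combinations to tap locations can be sketched as a simple nearest-pattern lookup. The activation patterns and body locations below are invented for illustration; the actual prototype classifies much richer acoustic features.

```python
# Toy illustration of resolving a tap location from which of the five
# frequency-band sensors fire. Patterns and locations are invented for
# illustration, not taken from the CMU/Microsoft prototype.

# Each tap location excites a characteristic combination of the five
# band-weighted cantilever sensors (1 = sensor triggered, 0 = quiet).
TAP_PATTERNS = {
    (1, 1, 0, 0, 0): "forearm",
    (0, 1, 1, 0, 0): "wrist",
    (0, 0, 1, 1, 0): "palm",
    (0, 0, 0, 1, 1): "index finger",
}

def classify_tap(activations):
    """Return the location whose stored pattern is closest to the
    observed five-sensor activations (by Hamming distance)."""
    best = min(TAP_PATTERNS,
               key=lambda p: sum(a != b for a, b in zip(p, activations)))
    return TAP_PATTERNS[best]

print(classify_tap((1, 1, 0, 0, 0)))  # exact match: forearm
print(classify_tap((1, 0, 0, 0, 0)))  # noisy reading, nearest: forearm
```

Matching to the nearest known pattern rather than requiring an exact hit gives the scheme some tolerance for noisy or partial sensor readings.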
Samsung on Monday announced what it claims is the industry's first 30nm-class DRAM to successfully complete customer evaluations in 2Gb (gigabit) densities.
"Our accelerated development of next generation 30nm-class DRAM should keep us in the most competitive position in the memory market," said Soo-In Cho, president, Memory Division, Samsung Electronics. "Our 30nm-class process technology will provide the most advanced low-power DDR3 available today and therein the most efficient DRAM solutions anywhere for the introduction of consumer electronics and server systems."
According to Samsung, shrinking down to a 30nm manufacturing process allows the company to raise production by 60 percent over 40nm-class DDR3. And as far as consumers are concerned, the company's Green DRAM lowers power consumption by up to 30 percent over 50nm-class DRAM. To give a real-world example, Samsung says a 4GB, 30nm-class module in a new-generation notebook will draw only 3W.
IBM on Friday announced it has inked a new agreement with the ABB Group, a global provider of automation technologies, to transform the company's Information Systems (IS) infrastructure across 17 countries in Europe, North America, and Asia Pacific.
"With the new agreement, ABB will realize considerable savings, while harmonizing and optimizing IS infrastructure," said Haider Rashid, ABB's global Chief Information Officer. "Our partnership with IBM allows us to implement new technologies and processes to build for continued globalization of our business. At the same time, we will be improving energy efficiency."
ABB said it expects immediate cost savings as a result of the new agreement, while it also puts ABB in a better position to utilize cloud computing down the line.
Chances are you've heard of graphene transistors before, and that's because the technology is touted as capable of one day replacing silicon. IBM Research claims to have just overcome one of the biggest roadblocks on the way there by opening a "bandgap" for carbon-based graphene field-effect transistors (FETs).
"Graphene doesn't naturally have a bandgap, which is necessary for most electronic applications," said IBM Fellow Phaedon Avouris. "But now we can report tunable electrical bandgaps of up to 130meV for our bi-layer graphene FETs. And larger bandgaps are certainly feasible."
Avouris says this latest breakthrough swings the door wide open for the future use of graphene in digital electronics and optoelectronic devices.