Despite a lingering recession, Microsoft isn't holding back when it comes to spending. According to Kevin Turner, Microsoft's chief operating officer, the Redmond giant will spend around $9.5 billion on research and development this year, which is about $3 billion more than the next closest tech company.
"Especially in light of the tough, difficult macroeconomic times that we're coming out of, we chose to really lean in and double down on our innovation," Turner said.
Much of that investment will go toward the cloud, an area in which Turner sees his company becoming a leader as it tries to "change and reinvent" itself. Turner added that Microsoft will still maintain a significant on-premises software business, even as companies such as Google pursue cloud-only software solutions.
In a system-wide assessment based on a study of Texas’s Technology Immersion Pilot (TIP) program, researchers found that laptop availability in school didn’t seem to improve test scores, but that when laptops were taken home, student scores did improve: “Home Learning—which measured the extent of a student's laptop use outside of school for homework in each of the four core-subject areas and for learning games—was the strongest implementation predictor of reading achievement.”
A study of California students produced similarly inconsistent results, but shed some light on possibilities. Laptops were used to help students over the “fourth-grade slump,” which occurs when students make the transition from “learning to read” to “reading to learn.” No fourth grader experienced the slump, but all did, to a degree, in fifth grade, with laptop users showing smaller declines in reading comprehension than those without laptops. Overall, the students’ test score improvements were confined to areas where laptops were directly relevant: literary response and analysis, and writing strategies.
The authors of this last report raise an interesting point: “what is best taught and learned with laptops is not covered on standardized tests at all.” In other words, laptops may be well suited only for some aspects of education, but those aspects aren’t yet well recognized, or aren’t being suitably tested. What needs to happen is to first figure out where laptops can best be used, and then measure their impact in ways suited to that newly acquired knowledge.
Though it’s not exactly a cure for cancer or a source of sustainable energy, some researchers think it’s important not to tick off every living soul in a movie theater because you left your phone turned on. Enter “whack gestures.”
Researchers at Carnegie Mellon University in Pittsburgh and Intel Labs in Seattle have created a “whack vocabulary” of gestures for interacting with accelerometer-equipped cell phones. The functionality is simple: whack your phone to shut it up. Chris Harrison, co-developer of the system at Carnegie Mellon, says, “I think for whack gestures to be commercially viable only two gestures might be desired: one to silence the phone, and a second to postpone an alert, ask the caller to try again in 5 minutes or snooze an alarm.”
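Under the hood, detecting a whack can be as simple as watching the accelerometer for a brief spike well above gravity. Here's a rough sketch of that idea in Python (this is not the researchers' actual algorithm; the threshold value and function names are hypothetical):

```python
import math

GRAVITY = 9.8             # m/s^2, acceleration at rest
WHACK_THRESHOLD = 30.0    # m/s^2 above resting; hypothetical tuning value


def is_whack(sample):
    """Return True if a single (x, y, z) accelerometer sample
    looks like a sharp whack rather than normal handling."""
    x, y, z = sample
    magnitude = math.sqrt(x * x + y * y + z * z)
    # A whack shows up as a momentary spike far above gravity.
    return abs(magnitude - GRAVITY) > WHACK_THRESHOLD


def detect_whacks(samples):
    """Count distinct whacks in a stream of samples, treating a run
    of consecutive above-threshold samples as a single event."""
    whacks = 0
    in_spike = False
    for s in samples:
        if is_whack(s):
            if not in_spike:
                whacks += 1
            in_spike = True
        else:
            in_spike = False
    return whacks
```

A real implementation would also have to filter out false positives like a phone dropped in a bag, which is presumably where the rest of the "whack vocabulary" work comes in.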
I think this is a great idea. It could lead to some hilarious outbursts of self-violence, all the while making the world a little less aggravating for everybody.
Most people seem determined to prove that cell phones are out to fry our brains, but could a call or two a day actually join red wine in the united federation of healthy vices? Okay, perhaps that’s a bit of a stretch, but a new study has found that lab mice that were genetically altered to develop Alzheimer’s disease performed better on thinking and skill tests after exposure to cell phone-style electromagnetic waves. “Electromagnetic waves prevent the aggregation of that bad protein of the brain,” said Gary Arendash of the University of South Florida.
The study looked at the effects of cell phone use for two hours per day over a seven- to nine-month period, and the results were the opposite of what researchers were expecting. “We had expected cell phone exposure to increase the effects of dementia,” said Arendash. After decades of research there is still no cure, and few effective treatments, for Alzheimer’s, the most common form of late-life dementia, with over 35 million people suffering from the disease.
The evidence that cell phone radiation is safe continues to mount, but I suppose only time will tell.
The future of YouTube could be left in your hands, as well as the hands of anyone else who participates in the video-sharing site's user research surveys.
The latest user experience study asked YouTube users to depict their ideal YouTube layout using printed-out features glued to magnets. Most of the participants said they "just want to watch" and that an ideal layout would consist of little more than a player and a title. But a smaller group -- mostly consisting of those who upload videos -- craved a far busier design brimming with social features, comments, descriptions, and more.
This is where you come in.
"Sometimes having users come into labs is not enough, though; we want to understand how users use YouTube in their context, in their living room, with their laptop on their lap, sprawled out on the couch," YouTube wrote in its blog. "In this case we might have field studies where we interview users in their homes."
You can take a short user survey here, and if you're interested in participating in any upcoming research, YouTube has a form you can fill out here.
Since 2001 Microsoft researcher Gordon Bell has been compulsively tracking every bit of personal data that he generates in his daily life, in the interest of finding out just how much digital storage it would take to contain it.
Bell, who works at the Microsoft Silicon Valley Research Group and calls his project MyLifeBits, has stated that “The problem isn't putting it all in. The problem is getting it out. When I started, I couldn't find anything!” So far Bell has tracked all the web sites he’s visited (221,173), photos he’s taken (56,282), emails he’s sent and received (156,041), documents written and read (18,883), phone conversations had (2,000), photos snapped by a SenseCam hanging around his neck (66,000), songs he’s listened to (7,139), and videos taken by him (2,164). To collect all this information he uses a desktop scanner, a digicam, a heart rate monitor, a voice recorder, a GPS logger, a pedometer, a smartphone, and an e-reader.
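For a sense of scale, the item counts reported above can be tallied directly; this quick back-of-the-envelope sketch just sums the figures from the article (the category labels are mine):

```python
# Item counts for Bell's MyLifeBits archive, as reported above.
lifelog_counts = {
    "web sites visited": 221_173,
    "photos taken": 56_282,
    "emails sent and received": 156_041,
    "documents written and read": 18_883,
    "phone conversations": 2_000,
    "SenseCam photos": 66_000,
    "songs listened to": 7_139,
    "videos taken": 2_164,
}

# Roughly 530,000 items logged since 2001.
total_items = sum(lifelog_counts.values())
print(total_items)  # 529682
```

Over half a million items, and that's before counting the raw sensor streams from the heart rate monitor and GPS logger.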
He does suspect that there’s some need to forget though. Being able to wipe clean difficult memories of the past could be some evolutionary trick. “If you think you should forget, you should,” states Bell. “But for God's sake, keep all the papers you've written and the photos you take. Sometime down the road you might be looking for something and you won't even give yourself the chance of finding it.”
The team behind this project consists of researchers from institutions in the US, Singapore and China. The new LEDs, though fully inorganic, possess qualities associated with both organic and inorganic LEDs. "We wanted to see if we could use inorganic LEDs in ways that exploit some of the processing advantages of organic LEDs,” John Rogers, a materials scientist at the University of Illinois, told the journal Science.
Japanese researchers have made a major breakthrough that could prove to be a watershed in the development of flexible OLEDs. Scientists from the Center for Future Chemistry at Kyushu University in Fukuoka, Japan, have concocted a “liquid-OLED.”
They detail their innovation in the latest issue of Applied Physics Letters. The “liquid-OLED” gets its name from its use of a liquid semiconductor layer. The technology could yield more pliant and reliable roll-up OLEDs than other approaches currently undergoing the rigors of testing and fine-tuning in other parts of the world.
According to some recent research by The NPD Group, many of the people buying netbooks don’t know that they’re doing so.
Sure, the two names sound similar (notebook and netbook), but a whopping 60 percent of the people who purchased a netbook expected the same functionality as a notebook. Needless to say, the confusion has led to some irritation.
“We need to make sure consumers are buying a PC intended for what they plan to do with it,” stated Stephen Baker, vice president of industry analysis at NPD. “There is a serious risk of cannibalization in the notebook market that could cause a real threat to netbooks' success. Retailers and manufacturers can't put too much emphasis on PC-like capabilities and general features that could convince consumers that a netbook is a replacement for a notebook. Instead, they should be marketing mobility, portability, and the need for a companion PC to ensure consumers know what they are buying and are more satisfied with their purchases.”
Just in case you were worried that Intel wasn’t committed to its heavily delayed Larrabee platform, a $12 million investment in a new Visual Computing Institute should help convince you otherwise. Located at Saarland University in Saarbrücken, Germany, it is the largest joint project ever formed between Intel and a European university. The institute will help Intel explore advanced graphical computing technologies, covering everything from more realistic gaming to advanced 3D user interfaces.
The primary focus of the research will be Intel’s tera-scale computing program. This will help Intel better understand how Larrabee’s unique many-core x86 architecture can deliver sustainable performance increases over modern-day GPUs. Larrabee has been delayed until some unknown date in 2010, presumably because it hasn’t yet achieved the kind of performance gains Intel was hoping for against Nvidia and AMD.
In addition to the tera-scale research, Intel will also work with other hardware design labs in Barcelona, Spain, and Braunschweig, Germany, to help optimize the Larrabee design. Z-buffering, clipping, and even ray tracing are all promises made by the Larrabee team, but clearly the software needed to make all this happen still requires some work.
Want more details? Click here to watch the press video.
So is Larrabee really the future? Or does this only prove Nvidia’s case that its promise is overhyped?