In our last white paper roundup, we explained the technology behind three modern connectors. And while stuff like USB 3.0 and Light Peak is pretty exciting, we can't help but feel like technologies that speed up physical connections are a little behind the times. After all, isn't the future supposed to be wireless?
In that spirit, our new batch of whitepapers explores the wild world of wireless technologies, including 4G, Near Field Communication, and 802.11ac Wi-Fi. So keep reading, and educate yourself about this generation's wireless tech.
Here at Maximum PC, we carry over some of boot's best traditions, among them the white paper, which explains key aspects of technology and advancements in the field. Understanding the inner workings of tech is what really separates the nerds from the normals, the hard-core from the hardly informed, the PC master from the PC user.
We've done so many of them now that sometimes it's a struggle to find new technologies. That wasn't always the case, though. In 1997, we wrote our first white paper about a topic as fundamental to computing as you can get: RAM. Read on to see what we wrote, and tell us what kind of tech you'd like to see a white paper on in the comments!
“After the researcher voluntarily removed these applications from Android Market, we decided, per the Android Market Terms of Service, to exercise our remote application removal feature on the remaining installed copies to complete the cleanup,” Android security lead Rich Cannings wrote in a blog post.
He then went on to tout remote deletion as an integral part of Google's response mechanism against malicious apps: “This remote removal functionality — along with Android’s unique Application Sandbox and Permissions model, Over-The-Air update system, centralized Market, developer registrations, user-submitted ratings, and application flagging — provides a powerful security advantage to help protect Android users in our open environment.”
The timing of this revelation suggests Google is trying to offset any harm SMobile's claims may have done. But the Ohio-based security firm is standing by its report and is unlikely to do a volte-face.
You can’t swing a dead Na’vi without hitting a new 3D display product these days. Three-dimensional imaging was actually invented in the 1800s, and has been used sporadically in movies since the 1920s, but James Cameron’s sci-fi epic Avatar is bringing it into the mainstream.
Now that 3D is less of a gimmick, TV manufacturers are beginning to incorporate the technology into their products. Panasonic, Samsung, and Sony all announced new 3D TVs at CES this past January. And Avatar could be the best thing to happen to Nvidia and Zalman in their efforts to sell PC gamers on their respective videocards and 3D displays. Market research firm DisplaySearch projects that annual sales of 3D-ready monitors will grow from 40,000 units in 2009 to 10 million by 2018.
So, given that at least some early adopters will buy a 3D display in due time, it’s worth knowing how this visual trickery works. Knowledge is power in the world of upgrading.
Competing technologies may use different implementations, but all 3D video is based on stereoscopic imaging: An illusion of depth is created by presenting a slightly different image to each eye. Each image is of the same object or scene but from a faintly different perspective. Your brain then synthesizes the two images into a spatial representation. The most common 3D applications depend on the viewer wearing either active eyewear (e.g., liquid-crystal shutter glasses) or passive eyewear (e.g., linearly or circularly polarized 3D glasses).
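The depth illusion described above comes down to simple geometry: the farther an object sits from the plane of the screen, the farther apart its left-eye and right-eye images must be drawn. A minimal sketch of that similar-triangles relationship (the function name and the specific distances are illustrative, not from any particular 3D system):

```python
def screen_disparity(eye_sep_cm, viewer_dist_cm, object_dist_cm):
    """On-screen separation (cm) between the left-eye and right-eye
    images of a point at object_dist_cm from the viewer, by similar
    triangles. Positive = object appears behind the screen; negative
    = it appears to pop out in front."""
    return eye_sep_cm * (object_dist_cm - viewer_dist_cm) / object_dist_cm

# Illustrative numbers: ~6.5 cm between the eyes, viewer 60 cm away.
print(screen_disparity(6.5, 60, 60))    # 0.0   -> object at screen depth
print(screen_disparity(6.5, 60, 120))   # 3.25  -> appears behind screen
print(screen_disparity(6.5, 60, 40))    # -3.25 -> appears in front
```

Active shutter glasses and polarized glasses are just two different ways of making sure each eye sees only its own copy of the image pair.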
The performance of an LCD monitor ultimately depends on how its liquid crystals are manipulated to channel light. We’ll examine the three most common technologies: Twisted Nematic (TN), In-plane Switching (IPS), and Vertical Alignment (VA).
Each of these three technologies creates a pixel using a cell of liquid-crystal molecules controlled by a thin-film transistor. Liquid crystals are used because they're capable of affecting light as a solid would, while exhibiting the malleability of a fluid. In a color LCD, each pixel is subdivided into three cells, or subpixels, which are colored red, green, and blue, respectively, by additional filters. These cells are arranged in a matrix of rows and columns sandwiched between two panes of glass, with a polarizing film on the exterior side of each pane.
A light source, such as a cold cathode fluorescent lamp or an LED grid, is placed behind the first glass panel. Light waves from the backlight follow the alignment of the liquid-crystal molecules, but they must pass through the two polarizing filters before reaching the surface of the display. Light waves must be oriented perfectly parallel to the first filter to pass, but since the second filter is oriented perpendicular to the first, no light will pass unless it’s reoriented first.
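The crossed-polarizer behavior described above is governed by Malus's law: the fraction of polarized light that passes a second filter is the squared cosine of the angle between the light's polarization and the filter's axis. A quick sketch (the function is our own illustration; the physics is standard):

```python
import math

def transmitted_fraction(angle_deg):
    """Malus's law: fraction of polarized light that passes a filter
    rotated angle_deg away from the light's polarization axis."""
    return math.cos(math.radians(angle_deg)) ** 2

print(transmitted_fraction(0))    # 1.0 -> parallel filters pass everything
print(transmitted_fraction(90))   # ~0  -> crossed filters block everything
# In a twisted-nematic cell, the liquid crystals rotate the light 90
# degrees between the crossed filters, so it passes as if they were
# parallel; applying voltage untwists the crystals and darkens the pixel.
```

That 90-degree twist, switched on and off per subpixel by the thin-film transistors, is what turns a uniform backlight into an image.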
Today, we’re starting to see the first motherboards with USB 3.0 support. That support exists in the form of a discrete controller chip, typically the NEC uPD720200; it will likely be late 2010 or sometime in 2011 before we see USB 3.0 integrated into motherboard chipsets. Still, USB 3.0 is a major leap beyond USB 2.0, so peripheral manufacturers are already announcing products to support the new standard.
First, let’s clarify some terminology. USB 1.0/1.1 was typically just called USB, and supported throughput up to 12Mb/s. When USB 2.0 arrived, with its 480Mb/s speed, the USB Working Group (www.usb.org) needed a distinguishing name, hence Hi-Speed USB. USB 3.0 will be called SuperSpeed USB. Got that?
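Those signaling rates are quoted in megabits per second, so dividing by eight gives the theoretical ceiling in megabytes per second. A quick conversion (the 5Gb/s figure for SuperSpeed USB is the published raw rate; real-world throughput is always lower thanks to protocol and encoding overhead):

```python
# Raw USB signaling rates in megabits per second. These are spec
# maxima; actual throughput is lower due to protocol overhead.
RATES_MBPS = {
    "USB 1.1 (Full-Speed)": 12,
    "USB 2.0 (Hi-Speed)": 480,
    "USB 3.0 (SuperSpeed)": 5000,  # 5Gb/s
}

for name, mbps in RATES_MBPS.items():
    print(f"{name}: {mbps} Mb/s = {mbps / 8:.1f} MB/s theoretical max")
```

So even a conservative real-world fraction of SuperSpeed's 625MB/s ceiling is enough to keep a fast SSD or external RAID busy, which is why peripheral makers are lining up behind the standard.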
Batteries are everywhere. They’re in our phones, mice, cars, laptops, game machines, controllers, remotes, cameras—you name it. Battery technology influences the design, capabilities, and feature set of nearly everything portable, from laptops and cell phones to hybrid and electric vehicles.
Most of the batteries in our lives are rechargeable, and our more eco-aware world is quickly replacing standard alkaline AA and AAA batteries with rechargeable equivalents. Still, few people know how all these batteries work or how to best take care of them.
We’re going to focus on common rechargeable battery types, but before we get into that we should cover a few basics about how batteries work and go over common terms.
Though solid state drives have existed for years, it is only recently that they’ve gained any sort of market penetration for average users. As we stated in our February 2009 white paper on the subject, solid state drives offer many advantages over traditional magnetic drives. Unlike mechanical hard drives, SSDs have no moving parts, so they draw less power and produce no vibrations. They’re also more resistant to physical shock. And most importantly, solid state drives offer much higher read and write speeds than traditional hard drives—at least when they’re new. Due to their NAND flash architecture, SSDs can suffer serious slowdowns once they run out of fresh blocks to write to. The TRIM command, found in Windows 7 and newer releases of the Linux kernel, aims to fix this. But what is TRIM, and why is it even necessary?
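The slowdown comes from a quirk of NAND flash: pages can be written only when blank, and blanking happens a whole block at a time. A toy model (our own illustration, not any vendor's firmware) of why an "overwrite" on a stale page forces the slow read-erase-rewrite path, and what TRIM changes:

```python
# Toy model of NAND write behavior. A page can be written only if it's
# erased; rewriting a used page forces the whole block to be erased and
# its surviving pages copied back -- the slow path. TRIM lets the drive
# mark deleted pages invalid so they don't need to be preserved.

BLOCK_SIZE = 4  # pages per block (real blocks hold many more)

class Block:
    def __init__(self):
        self.pages = [None] * BLOCK_SIZE  # None = erased, ready to write
        self.erase_count = 0

    def write(self, page, data):
        if self.pages[page] is not None:
            # Page already holds data: read back the block, erase it,
            # and rewrite the surviving pages (the slow path).
            kept = list(self.pages)
            self.pages = [None] * BLOCK_SIZE
            self.erase_count += 1
            for i, d in enumerate(kept):
                if i != page and d is not None:
                    self.pages[i] = d
        self.pages[page] = data

    def trim(self, page):
        # Simplification: TRIM marks the page's contents as garbage,
        # so future erase cycles don't bother copying it.
        self.pages[page] = None

b = Block()
b.write(0, "a")      # fast: page was erased
b.write(0, "b")      # slow: forces a block erase to "overwrite"
print(b.erase_count) # 1
```

Without TRIM, the drive has no idea which pages the filesystem has deleted, so every block eventually looks full and every write takes the slow path; with it, the drive can erase garbage pages in the background and keep fresh blocks available.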
Organic light-emitting diodes, or OLEDs, are often touted as the next big thing in display technology, offering brighter colors, true black, lower power consumption, and better off-axis viewing than traditional LCD screens. They’ve popped up in gadgets from high-concept to mundane: The infamous Optimus Maximus keyboard, for example, utilizes many tiny OLED screens in its programmable and customizable keycaps, and both Sony’s new X-series Walkman and Microsoft’s new Zune HD have OLED screens. OLED technology has made great strides in the past 10 years, and cheaper and better manufacturing processes mean they’ve started appearing in everything from media players to phones to high-definition televisions—even keyboards. But what are OLEDs?