Another birthday for the CD has come and gone, and yet the damn things just won't die. On Aug. 17, 1982, the Compact Disc was born into the age of rampant consumerism that was the 1980s. Big hair was in; big vinyl and the big snarls of tape from cassettes were out.
The CD, of course, wasn't without its drawbacks. Discs disliked abuse and absolutely had to live in their cases. I re-bought the bulk of my collection on CD for my first car CD player before going back to cassette so I could dub my own playlists and stop spending money on music I had already bought. Even the players then were delicate. My car CD player touted a three-second anti-skip buffer for those canyons in the road called potholes. Of course, back in East Texas they had washboard roads that could eat up that buffer and just ruin AC/DC's Who Made Who.
It would be well over a decade before semi-affordable CD-Rs arrived. Since then, everything from music to photos to video to Grandma's recipes has been stored on CD.
Some really great things came on CD, like Windows 98, Bruce Springsteen's Born in the U.S.A., and Starsiege: Tribes. I invested a big chunk of my life in Tribes.
Of course, the CD is also responsible for bringing us some really bad things. Remember Windows Millennium Edition? They should have melted down those CDs before they left the factory. Then there was Mary Schneider's Yodeling the Classics. That thing should be classified as a method of torture.
Remember when AOL used to spew out those CDs to pimp its dialup service? I used to use them as coasters for my coffee cup. When a new one came into the office (about every few days, it seemed), I'd toss my old AOL coaster and put down the new one.
What do you think are some of the best and worst things that have ever been put to CD?
As we covered previously, Dell was trying to trademark the term 'cloud computing' and had filed the necessary paperwork with the U.S. Patent and Trademark Office. The application had reached the Notice of Allowance phase, in which the applicant receives written notification that the mark has survived the opposition period and that other parties have had their chance to object to the application.
In a ruling posted on the trademark office's web site on Aug. 12, the office rejected Dell's application to trademark the term 'cloud computing,' backing away from final official recognition of Dell's application.
The trademark office's findings said, "In addition to being merely descriptive, the applied-for mark appears to be generic in connection with the identified services and, therefore, incapable of functioning as a source-identifier for applicant's services." This leaves everyone in the IT field saying, "Well, duh!"
Each year, we ask, "Was this the best year ever for games?" A good deal of the time, our answer tends toward "Yes," with a few nostalgia-maniacs vehemently worshipping 1998 instead. "Oh, they're just raving fanboys," I've always thought of those stuck in '98. "Their opinions are rooted in so much misguided subjectivism that even a bulldozer couldn't budge them."
However, a recent post at the always-interesting Sexy Videogameland gave me some insight into another, altogether more-acceptable reason for gamers' unyielding grip on the past. The post, by Leigh Alexander, of course, took a look at our tendency to play a game once, shove it into a nice, dusty shelf corner, and leave it there with no hope of excavation. Why do we do this? Especially when, as Leigh pointed out, many of us were happy to bury months of our lives in a single game back in the day.
But the answer's simple, really: You're reading this column.
As a bleeding-edge gamer, when you're not playing a game, you're probably reading about other games -- basking in the ever-brightening glow of a new title's hype -- and getting yourself psyched to play them. This column, with its daily dose of the latest gaming news, only helps propagate this trend.
Really though, does it matter? As Leigh pointed out, our consumer-focused society breeds hit-driven industries. Movies, TV, sports -- you name it. "15 seconds of fame" is an apt phrase. So we're just like other media. Big deal. But I think it does matter. I think games, by virtue of their interactivity, are meant to break the typical, rapid-fire hype cycle. And that's why so many gamers love 1998. The year was chock-full of top-notch titles, but gamers still spent hundreds of hours with their favorites -- testing boundaries and pushing limits. Why? The hype train as we know it hadn't quite picked up steam. Print was still strong and the Internet wasn't the all-knowing force that it is today.
And therein lies the problem. As the gaming industry grows -- as the press expands and the hype train takes on new cars -- it defies its own potential. Someday, games will shrug off the shackles of linearity, but will gamers stick around to experience those trailblazers in different ways? Or will our own anticipation for The Next Big Thing get the best of us?
Today's Roundup details a couple of initiatives that could grab at gamers' ankles and never let go, but will they work? Can't say. But for now, my commentary will have to suffice. It's all past the break.
Confused by terms like SATA II, SATA Gen 2, and SATA 3Gb/s? You're not alone. With today's release (link in PDF format) of the PHY (physical layer) portion of the forthcoming SATA revision 3.0 specification (details here), SATA-IO, the trade association responsible for defining Serial ATA specifications, is trying hard to stomp out the many misidentifications of SATA specifications and features over the years.
SATA revision 3.0 doubles the speed of the current 3Gb/s version, reaching transfer speeds of 6Gb/s. So, what should you call the newest member of the SATA specifications family? According to the SATA Naming Guidelines, here's what works:
The first reference in a document should be: "Serial ATA International Organization: Serial ATA Revision 3.0." Additional references can be to either "SATA Revision 3.0" or "SATA 6Gb/s."
To find out how SATA-IO is also working to clear up confusion for current technologies, join us after the jump.
What do Debian, one of the most popular and stable Linux operating systems, and I have in common? We both celebrated a birthday on August 16th! But unlike me, Debian has proved its maturity at 'only' age 15 and probably doesn't find fart jokes funny anymore. Debian has also been highly influential, as many of the popular GNU/Linux distributions you've read about or played with - including Ubuntu and Knoppix - are based on Debian.
To trace Debian's roots, you'd have to go back to 1993 when Ian Murdock, who is now VP of developer and and community marketing at Sun, first announced the OS. But why call it Debian? Because of a girl, of course! Ian combined the name of his then girlfriend (and now wife), Debora, with his own (Deb+Ian), the union of which gave birth to Debian.
- All versions of Debian are named after characters from the film Toy Story.
- There are always four versions.
- The least stable version of Debian is named after Sid, the emotionally unstable neighbor kid in Toy Story who enjoyed destroying toys.
It won't be long before single-core processors will seem as antiquated as single-speed CD-ROM drives, and the case could be made that we're already there. Dual- and quad-core processors rule the landscape, and while Intel's upcoming Core i7 has enthusiasts frothing at the mouth, the chip maker may have something even more mouth watering in the very near future.
If the latest rumor turns out to be true, expect a replacement architecture for Nehalem in 2010 that will double the number of cores per die to eight. Codenamed Sandy Bridge, the new architecture will, according to allegedly leaked slides, also support hyperthreading, giving the eight-core chip a generous 16 threads to work with. Also look for 16MB of L3 cache to find its way onto the chip.
But for all the hardware goodness, it's the software that may end up playing the biggest role in performance improvements. Intel will reportedly introduce a new instruction set called Advanced Vector Extensions (AVX) that will eventually supersede SSE. AVX doubles the width of the SIMD registers from 128 to 256 bits, letting a single instruction perform up to four double-precision calculations at once.
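As a rough sketch of what that wider vector unit means (this is an illustrative model, not Intel's actual intrinsics): a 256-bit register holds four 64-bit doubles or eight 32-bit floats, and one vector instruction operates on every lane at once.

```python
# Illustrative model of 256-bit AVX registers (not real intrinsics):
# register width divided by element width gives the number of "lanes"
# a single vector instruction processes in parallel.
AVX_REGISTER_BITS = 256  # up from 128 bits in SSE

def lanes(element_bits: int) -> int:
    """How many elements one AVX instruction operates on at once."""
    return AVX_REGISTER_BITS // element_bits

def vector_add(a, b):
    """One conceptual vector add: element-wise across all lanes."""
    return [x + y for x, y in zip(a, b)]

print(lanes(64))  # 4 double-precision values per instruction
print(lanes(32))  # 8 single-precision values per instruction
print(vector_add([1.0, 2.0, 3.0, 4.0], [10.0, 20.0, 30.0, 40.0]))
# [11.0, 22.0, 33.0, 44.0]
```

The catch, as the next paragraph notes, is that none of this matters unless compilers and developers actually target the new instructions.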
With over a year to go before the supposed new architecture makes a debut, will developers be ready by then to take advantage of the additional cores and new instruction set?
Four years is an eternity in the computer world, but it doesn't take a crystal ball to predict that Linux will continue making headway against Microsoft's closed-source Windows OS. Between Vista needing gimmicks to convert the skeptics (Mojave) and increasingly user-friendly versions of Ubuntu, Microsoft may find itself in a grudge match with the open-source community by 2012. But what can we expect out of a Linux distro in 48 months? InformationWeek attempts to answer that question with a mix of bold predictions and some much-needed feature enhancements. Let's take a look at some of the highlights.
Three Basic Usage Modes
Linux has traditionally been free for most users, but in-store boxed copies complete with a price tag have started popping up, and IW says this trend will "at least gain nominal momentum." Free-to-use variants won't be disappearing anytime soon, and IW sees free distributions that contain no components with patent encumbrances or other issues picking up steam.
While Linux is already present in a plethora of devices, look for it to become a brand name four years down the road, pushed in large part by the continued popularity of the Netbook market.
Bye-Bye Command Line!
One of the biggest roadblocks preventing Linux from marching into the mainstream market is ease of use. The days of typing in commands died with DOS, but on a Linux distro, even some basic configuration might require the user to fire up the Terminal. Of course, there are legions of Linux-ites who prefer it this way -- the same ones who not so affectionately refer to Ubuntu as Noobuntu.
Catch all the predictions here, then tell us your Linux predictions below!
Indilinx has completed development of its Barefoot (IDX22) high-performance solid state drive controller, built on a 90nm process, which boasts an impressive maximum read speed of 230MB/s and supports capacities of up to 512GB with multi-level cell (MLC) NAND flash. Indilinx claims "phenomenal performance at a competitive price."
Barefoot supports a native SATA 2.0 interface and provides maximum read/write speeds of 230MB/s and 170MB/s with SLC NAND flash, and 200MB/s and 160MB/s with MLC NAND. It uses Indilinx's unique architecture and technology, including four independently operating channels and an external DRAM buffer, and it enhances stability and reliability with two types of hardware error-correcting code (ECC).
SSD performance is improving by leaps and bounds. It's not clear whether this controller will compete directly with Intel's, and there's no mention of whether it's targeted at the portable or desktop PC market (or both).
Talk about a generational leap forward. The SSD revolution has barely begun, but while others are busy focusing on incremental capacity bumps nowhere near the size of the largest HDD, BitMicro says it can now make SSDs with a ginormous 6.5TB capacity.
According to TG Daily, the company made the claim at the Siggraph trade show held at the Los Angeles Convention Center. A company rep went on to say that the custom drives can hit up to 55,000 input/output operations per second (IOPS) with sustained (not burst) transfers of up to 230 MB/s. In other words, not only would this wonder drive thoroughly trounce today's SSDs in terms of capacity, but it would be faster too.
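Those two figures together hint at the workload BitMicro is quoting. A quick sanity check (my own arithmetic, using decimal megabytes):

```python
# If a drive sustains 230 MB/s while servicing 55,000 I/O operations
# per second, the average transfer per operation is tiny.
SUSTAINED_BYTES_PER_S = 230 * 1_000_000  # 230 MB/s, decimal megabytes
IOPS = 55_000

avg_bytes_per_io = SUSTAINED_BYTES_PER_S / IOPS
print(round(avg_bytes_per_io))  # ~4182 bytes per operation
```

That works out to roughly a 4KB transfer per operation, which suggests the IOPS figure is a small-block random-access number -- exactly the kind of workload where SSDs embarrass mechanical drives.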
The drives would also be physically bigger, with the loose-lipped rep saying the custom SSDs would be about two to three times the height of a regular drive.
Anyone think we'll see 1TB SSDs before long, let alone 6.5TB models?
Electronista says that Intel is planning a super-fast 160GB solid state drive. They report that Intel's flash memory marketing head, Troy Winslow, says the Z-P140 was just a prelude to a series of bigger announcements to come before the end of the year. Winslow goes on to say that there will be a series of 1.8- and 2.5-inch drives for ultraportables that will hold between 80GB and 160GB. They should also outperform the 100 megabyte-per-second read speed of the Samsung Flash SSD.
Solid state is really making inroads this year, and as many predicted, SSDs have invaded laptops first. Perhaps by 2010 they will be the default choice for enthusiast desktop builds. Their read speeds are faster than traditional hard drives' and their capacities have reached a useful level. Winslow points out that Intel's experience building quick interconnects between processors and chipsets helps it make improved memory controllers for its SSDs.
When they hit the same price point and capacity as Western Digital's VelociRaptor drives, and they do something about the pokey write speeds, count me in. They aren't quite ready for the mainstream, but they certainly look like the future.