The beauty of a Live CD is that it gives you a chance to access your computer or a batch of alternate applications without actually having to load up your operating system. You only need to pop the CD into your optical drive and boot it up from your BIOS -- this self-contained environment runs independently of anything that's located on your drive partitions, yet you can still perform a variety of tasks that manipulate the data on your drives.
For example, you can test out new Linux distributions using a Live CD, saving you the time and hassle of blanking an entire partition just to see if it's the right distribution for you. You can also manipulate the partitions of your drives using a Live CD, expanding and creating volumes to create alternate locations for new operating systems, files, or whatever it is you'd use a separate volume for. Live CDs are great for troubleshooting your system (or saving your data) when your primary operating system won't boot, and they can also be used to break into Windows installations that you've lost the password for.
All that functionality... and you don't even have to install a single program on your machine! Click the link to check out some of the best Live CDs that you should have sitting on your desk.
Following its rapid release schedule, eager Ubuntu fans need only wait until April 23rd for the next release of the open-source Linux distro. In the meantime, if a little over a month is just too long to wait, you can take a sneak peek at Ubuntu 9.04, Jaunty Jackalope, currently in alpha form.
The just-released Jaunty Jackalope Alpha 6 is the sixth alpha release of Ubuntu 9.04 and includes several new features, along with a handful of known bugs. Among the former are a new X.Org server (version 1.6), font sizes tailored to your monitor rather than defaulting to 96 dpi, a new style for notifications and notification preferences, a new Linux kernel (2.6.28-8.26), and support for the new ext4 file system.
Keep in mind that, as an alpha release, you should expect instability. Known issues include the disabling of the "encrypted home directory" option, video driver problems with the X server, mis-detection of font sizes resulting in abnormally small or large fonts, the disabling of Ctrl-Alt-Backspace, and an error message for users of Intel's i845 or i865 video chipsets stating "Fatal server error: Couldn't bind memory for BO front buffer."
Here's one you don't see every day, and have probably never seen before: A man with an embedded USB drive in his prosthetic finger.
After being involved in a motorcycle accident last May, Jerry Jalava was half a finger short of having all five digits on his left hand. On the advice of his doctor, who learned that Jalava was "a hacker," Jalava opted to have a USB drive attached to the fingertip of his prosthetic finger, instantly earning himself several hundred geek cred points. And if that weren't enough, Jalava earns a geek merit badge for carrying around a Billix Linux distro and the Freddy Got Fingered movie on his USB key.
On his blog, Jalava clarified that the prosthetic finger is removable, allowing him to detach it and "just leave my finger inside the slot" until he's finished.
With the prevalence of packaged software available for most distros, why would anyone want to compile software from source? Compiling lets you custom-fit a program to your particular hardware configuration and CPU architecture, which is useful if a program has no binary compatible with your processor. That is seldom a problem these days, however, since most computers now use 32- or 64-bit x86 processors. In the past, Linux enthusiasts often compiled programs from source to wring the greatest possible performance out of their hardware. More recently, increases in computing speed have made this largely a non-issue; while compiling may offer a slight performance boost, it is rarely enough to make a noticeable difference.
Package management on most distros, less diversity in CPU architecture among the user base, and massive increases in hardware speed have largely reduced or eliminated the need to compile software yourself, but there are still a few cases where you have to. The various official and unofficial software repositories for Ubuntu and other distros include most of the tools the average user will need, but they are not completely comprehensive. Old packages sometimes get dropped, updated versions are often slow to be added, and it may take a release cycle or more for brand-new programs to be included.
While Ubuntu and Debian have “backports” repositories that carry fairly new packages, many other distros do not have such a resource. For large projects with strong community support, the developers may offer nightly builds, but this is not the case for most projects. The only reliable way to get bleeding-edge software (stability issues aside) is to either find a repository that carries it or download the source code from the developer and build it yourself.
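On Debian-family systems, enabling a backports repository amounts to one extra line in your APT sources. As a hedged sketch -- the release name and mirror below are era-appropriate examples, so substitute the ones your own release uses:

```text
# Example /etc/apt/sources.list entry enabling backports
deb http://www.backports.org/debian lenny-backports main
# Then update and pull a newer package explicitly from backports:
#   sudo apt-get update
#   sudo apt-get -t lenny-backports install some-package
```

The -t flag matters: backports packages are pinned at low priority, so they are only installed when you ask for them by release name.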
In part one of our guide, we walked you through the process of finding a distro that is right for you. By now, you have hopefully become more familiar with the distros that are out there and have at least one that you would like to try. This chapter will walk you through downloading and burning a CD image of your chosen distro(s), the traditional way of partitioning and setting up a dual-boot system, and another way to dual-boot without repartitioning. Instead of providing a step-by-step tutorial for a specific installation process, our goal is to explain the underlying concepts in a generalized way that you can apply to many different Linux distros. You should also read our previous guide to installing Ubuntu for further instructions.
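Before burning any downloaded image, it's worth verifying it against the checksum the distro publishes -- a corrupted download is a common cause of mysterious install failures. A minimal sketch, where distro.iso is a stand-in for a real image (real projects publish MD5SUMS or SHA256SUMS files alongside their downloads):

```shell
# Verify a downloaded image against its published checksum.
# distro.iso here is a tiny stand-in file, not a real image.
echo "hello" > distro.iso     # pretend this is the download
md5sum distro.iso             # compare this digest to the published one
# If the digests match, burn it with your tool of choice,
# e.g.: wodim dev=/dev/cdrw -v distro.iso
```

If the digest doesn't match the one on the distro's download page, re-download the image before wasting a blank disc on it.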
Microsoft recently slapped TomTom with a patent infringement suit. The Redmond-based tech behemoth has claimed that TomTom’s devices are in direct violation of eight of its patents.
Some fear Microsoft’s suit against TomTom may be a straw in the wind, as three of the claims are related to the use of the Linux kernel. Microsoft’s lawyer Horacio Gutierrez tried to dispel such misgivings. He told Cnet that the claims pertaining to the implementation of “file management techniques used in the Linux kernel” are only specific to TomTom.
He insisted that Microsoft is not going to mount a massive legal assault against the open-source community. Jim Zemlin, the Linux Foundation’s executive director, also feels that it is unfair to jump to conclusions about the scope of this lawsuit. Gutierrez and Zemlin certainly don’t think that Microsoft’s suit against TomTom is an indication of trouble for the open-source community. What do you think?
We are certain that many of you want to try Linux to see what it is like, but have no idea where to start or how to get into it. This article is the first installment in a four-part guide that will gradually introduce you to the Linux environment and how to adjust to it if you are a new user.
One of the hardest things to do when starting out is finding a distro that is right for you. Many users try several before settling on one or two that they really like. Once they find a distro that feels right, they are often reluctant to switch unless it becomes unsuitable for their needs for whatever reason.
In most instances, choosing a distro ultimately comes down to three factors: your skill level, the purpose of the system, and package management.
If Marvell has its way, plug computers will soon become commonplace. The company today announced its Plug Computing initiative, which seeks to make always-on computing not only more flexible and easy-to-use than it is today, but also more environmentally friendly compared to a typical desktop or laptop PC.
A plug computer is essentially a small embedded computer that plugs into a wall socket and hooks into your home network via an Ethernet cable. It can then run network-based services that would typically be handled by a desktop or laptop. Marvell's SheevaPlug platform, for example, is built around a Kirkwood system-on-chip with a 1.2GHz Sheeva CPU core, 512MB of flash memory, and 512MB of DDR2 memory.
Cuba has debuted a new national Linux-based operating system dubbed "Nova." As one might expect, Cuba claims that the move will help the country replace proprietary Microsoft software running on the nation's computers. It almost sounds a little silly, but Cuba makes two noteworthy points as to why it's trying to purge this United States-based software from its networks. Nor is Cuba the first nation that's sought to replace Microsoft software with an open-source alternative.
According to Cuban officials, the switch is intended more as a move away from United States-backed software than away from Microsoft specifically. They claim that U.S. governmental agencies would be able to infiltrate Cuban systems because they could pressure Microsoft into giving up its "codes." It's unclear whether Cuba expects U.S. officials to actually hack into Cuban databases, break through encryption measures, or engage in some combination of nefarious activities. Cuban officials also suggest that importing Microsoft software violates the U.S. trade embargo, which would explain why Microsoft operating systems are allegedly more difficult to acquire for the island nation.
Grab your cigar and click the link to find out just how much Linux adoption Cuba expects to have within five years!
Cnet's Matt Asay reports that Microsoft has decided to set up an interoperability alliance with Red Hat. In enterprise computing, virtualization is the name of the game, and virtualization is what this alliance is all about. Whether you're running Red Hat Enterprise virtualization technologies, Microsoft's Windows Server 2008 Hyper-V, or Microsoft Hyper-V Server, the interoperability agreement will enable Red Hat and Microsoft guest operating systems to run on any of these virtualization platforms with technical support. For details, see the Red Hat website or the Microsoft TechNet blog announcement.
It will take time for Red Hat and Microsoft to validate server platforms for interoperability, and valid software support contracts are required. The best news for those of us who support enterprise-level virtualized platforms on Red Hat or Microsoft? No more finger-pointing, so you can spend your evenings winning your favorite frag-fest instead of playing pass-the-buck with operating system support staffs.