With the prevalence of packaged software available for many distros, why would anyone want to compile software from source? Compiling allows you to custom-fit a program to your particular hardware configuration and CPU architecture, which is useful if a program has no binary compatible with your processor. However, this is seldom a problem these days, since most computers now use 32- or 64-bit x86 processors. In the past, Linux enthusiasts often compiled programs from source to wring the greatest possible performance out of their hardware. More recently, this has mostly become a non-issue thanks to increases in computing speed; while compiling may offer a slight performance gain, it is rarely enough to make a noticeable difference.
Although the introduction of package management on most distros, less diversity in CPU architecture among the user base, and massive increases in hardware speed have largely reduced or eliminated the need to compile software yourself, there are still a few instances where you would have to do so. Although the various official and unofficial software repositories for Ubuntu and other distros include most of the tools that the average user would need for any given purpose, the repositories are not completely comprehensive. Old packages sometimes get dropped and updated versions are often slow to be added. It may also take a release cycle or more for brand-new programs to be included.
While Ubuntu and Debian have “backports” repositories that have fairly new packages in them, many other distros do not have such a resource. For large projects with large community support, the developer may offer nightly builds, but this is not the case for most projects. The only reliable way to get bleeding-edge software (stability issues aside) is to either find a repository that has it or download the source code from the developer and build it yourself.
If you want to be a programmer at some point, you are going to need to know how to compile applications, since that is really the only way to develop your own projects and effectively contribute your own code to open source projects that other people have started.
It is also wise to compile security-oriented software (like encryption tools) yourself. Although binaries are generally trustworthy if they come from an official repository or the developer's website, you can never be 100% sure of what you're getting unless you build it yourself (preferably after a code audit, if you have sufficient skill to conduct one).
Alternatively, you may find a program you are interested in that is not packaged for your distribution (e.g. it is only offered as an RPM, but you run a DEB-based distro). While there are tools (like alien) that can convert packages from one format to another, a converted package may not always work correctly.
In such situations, your only real option is to build from source.
Although compiling software from source can solve some of your problems, it can also create new ones. Compiling and installing software from source effectively bypasses your package management system. This means that you must personally do the work that your package management system would otherwise do, such as keeping track of installed software, satisfying dependencies, and even preventing conflicts between different programs. This last situation is where the most can go wrong.
A decent package manager knows the specific versions of the dependencies a program needs to run and can cross-reference that data with the needs of other programs. This is especially important when updating; if a new version of a program requires dependencies that would break other installed programs, the package management tool should postpone the update until the affected programs can be safely updated as well. When you compile from source, you bypass safeguards meant to protect your system's software integrity. This can be very dangerous if you have not taken the time to understand the repercussions this action may have.
Compiling software from source is a useful concept to understand, (or even essential, depending on your aspirations) but it should always be considered a measure of last resort. On most modern distros, there are much better ways to install software, and these should always be used first whenever possible.
This guide will tell you how to compile programs from source on Linux. We will not cover the specifics of building a kernel, but we will teach you how to build individual programs on any distro. Likewise, this guide will not address specialized tools (like emerge) that are found on source-based distros like Gentoo. This guide is intended for fairly advanced users instead of those new to Linux. Some knowledge of the terminal is required due to the way the compiling process works.
Before you compile your first program, you must prepare your compiler toolkit. Linux has many compilers and related tools available as part of the GNU Project; these include gcc (the GNU C compiler), g++ (the GNU C++ compiler), make (a tool to help automate the build process), and many others. You will probably need to install them yourself (check your distro's repositories), since few distros include them out of the box.
Fortunately, many distros bundle most of the compiler utilities into a single package (Ubuntu's is called “build-essential”), so you won't have to install each one separately. Unless you are familiar with the language the program is written in, you may not know exactly which compilation tools you will need until you run the configuration script.
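Before installing anything, you can check which of the core tools are already present. A minimal sketch, assuming a POSIX shell; the install commands in the comments are the usual ones for each distro family, so adjust them for your system:

```shell
# Typical ways to install the whole toolchain (varies by distro):
#   Debian/Ubuntu:  sudo apt-get install build-essential
#   Fedora:         sudo dnf groupinstall "Development Tools"
# Check which core tools are already available:
for tool in gcc g++ make; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "$tool: found"
    else
        echo "$tool: MISSING"
    fi
done
```

Any tool reported as MISSING will need to be installed before you can build.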
1. First, you must acquire the source code of the program you wish to compile (from the developer's website or an online resource like SourceForge). Source code usually comes in an archive file called a tarball, identifiable by a .tar.gz extension (.tar.bz2 and .tar.xz are also common). Save the tarball to a folder where you have write permission, such as your own home directory. The temp folder (/tmp) is not recommended, for reasons we will address later.
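Many developers also publish checksums alongside their tarballs, and comparing them is a cheap way to catch a corrupted or tampered download. A minimal sketch of the verification step; the filename is hypothetical, and a stand-in file is created since there is no real download here:

```shell
# Scratch directory and a stand-in for a downloaded tarball (illustration only)
dir=$(mktemp -d) && cd "$dir"
echo 'pretend source code' > program-1.0.tar.gz

# The developer would publish a checksum file like this one...
sha256sum program-1.0.tar.gz > SHA256SUMS

# ...and you verify your download against it:
sha256sum -c SHA256SUMS     # prints "program-1.0.tar.gz: OK" on a match
```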
2. Open a terminal and navigate to the directory where you saved the source tarball.
3. Extract the tarball by typing “tar -xvf program.tar.gz” (substituting the real filename for program.tar.gz; modern versions of tar detect the compression type automatically). Most of the time, extracting a tarball will create a new folder (this guide will refer to it as the build directory) and place the archive's contents inside it.
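To make the step concrete, here is a self-contained sketch. Since there is no real download, it first builds a dummy tarball with a hypothetical name, then performs the actual extraction step (a throwaway scratch directory is used for the demonstration only):

```shell
# Set up a scratch directory and a dummy tarball (stand-in for a download)
dir=$(mktemp -d) && cd "$dir"
mkdir program-1.0
echo 'source files would live here' > program-1.0/README
tar -czf program-1.0.tar.gz program-1.0
rm -r program-1.0

# The actual step: extract the tarball
tar -xvf program-1.0.tar.gz

# Extraction created the build directory; enter it
cd program-1.0 && ls     # -> README
```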
4. Navigate to the build directory.
5. The build directory may contain many files and subfolders, but the first thing you should look for is a file called “configure”. For a moment, compare the software compilation process to cooking: before you start making dinner, you need to know the ingredients and the cooking instructions, or the dish will turn out wrong. The configuration script is like a grocery list; it makes sure that you have everything you need before the compilation process begins.
6. Run the configuration script by typing “./configure” (do not omit the dot and forward slash at the beginning). The configuration script will run in the terminal and check your system for the presence of the necessary compilers and dependencies. While it runs, the configuration script will print output from the various tests it conducts. This output is your guide to fixing any problems you may encounter. If the configuration script finishes successfully, go to Step 8. Otherwise, go to the next step.
7. If the configuration script fails, it will usually tell you why. The most common cause of failure is a missing or outdated dependency. A program may require a specific version of a dependency; in that case, check whether your distro's repositories carry it (development headers are often split into separate “-dev” or “-devel” packages), and if not, your only recourse is to download and compile the right version from source before continuing to compile the program that needs it.
Quite often, the situation is recursive; a dependency may have dependencies of its own, and those must be compiled as well, and so on. This is why compiling can be so dangerous; a new dependency (or sub-dependency) you compile can cause unforeseen conflicts with the version you already have installed. However, this does not always happen; it is possible and fairly common for different versions of various dependencies to exist side-by-side without any problems.
8. If the configuration script finishes successfully, it will generate a set of build instructions called a “makefile”. The makefiles (there may be one in each subdirectory in addition to the main one) give the precise instructions for compilation to the Make program. The Make program controls the compilers and tells them what to do. To start the compilation process, type “make”. If there are no makefiles, you will not be able to compile until the situation is rectified. Although you can run “make” as root, it generally isn't required.
The most common error message you can run into at this step is “make: *** No targets specified and no makefile found. Stop.” If this error happens, that means that the Make program has no makefile to use, and therefore has nothing to do. In this case, the first thing you should do is make sure that the configuration script finished successfully, since it would have generated the makefiles if it did.
The Make program will produce output as it processes each source code file. Don't worry if you don't understand the compiler output; at this point, all you have to do is wait for it to finish. Depending on the size of the program you are compiling and the speed of your computer, it may take anywhere from several seconds to a few minutes (or longer) to compile everything.
9. Once the Make program is finished, you will have to install the program. Switch to root (type “su” and provide the root password; on distros like Ubuntu that use sudo instead, you can simply type “sudo make install”). As root, type “make install” and the Make program will handle the installation for you. Root access is required because make will add files to /usr/bin and other places that your own user account does not have permission to write to.
After the program is installed, you should be able to use it. You should leave the build directory in place after compilation, because it lets you uninstall or reinstall the program later. (Typing “make uninstall” in a program's build directory will remove the program if it is installed; typing “make install” again will reinstall it.) If a program's build directory is removed, you will have to recreate it (by compiling the program again) before you can easily manage any installed instances of the program.
For this reason, you should not compile a program from the temp folder, since the build directory will be destroyed when the computer shuts down.
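The install/uninstall pair can be sketched with a toy makefile. The DESTDIR variable (a widely used convention supported by most generated makefiles) redirects the install root, so this throwaway demonstration stays inside a scratch directory instead of touching /usr (a real build, as noted above, should live somewhere permanent):

```shell
dir=$(mktemp -d) && cd "$dir"

# A stand-in "program" plus toy install/uninstall targets
printf '#!/bin/sh\necho hello\n' > hello
printf 'install:\n\tmkdir -p $(DESTDIR)/usr/bin\n\tcp hello $(DESTDIR)/usr/bin/hello\nuninstall:\n\trm -f $(DESTDIR)/usr/bin/hello\n' > Makefile

make install DESTDIR="$dir/stage"     # stages the file under $dir/stage/usr/bin
ls "$dir/stage/usr/bin"               # -> hello
make uninstall DESTDIR="$dir/stage"   # removes it again
```

Without DESTDIR (the normal case), the same targets write to /usr/bin directly, which is why make install needs root.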
Although most programs follow the standard procedure listed above, some do not. These are often small or older programs. In many cases, they have no configuration script and come with a preconfigured makefile instead. You may have to examine the makefile yourself with Vim or another text editor to see what the program needs in order to work. It is impossible to address every potential deviation from the standard procedure in this guide, but we can give you some advice to help you get started in these difficult situations.
The key to figuring out a non-standard program is experimentation. The first thing you should do is run “make” or “make install” to see what happens. Sometimes one (or both) of those commands is enough to get the program built and installed. If that doesn't work, check the developer's website or any readme files included with the source for further help.
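When you are faced with an unfamiliar makefile, “make -n” (dry run) is a safe first experiment: it prints the commands a target would execute without actually running them. A small illustration with a made-up makefile:

```shell
dir=$(mktemp -d) && cd "$dir"

# A made-up makefile whose install target we don't want to run blindly
printf 'install:\n\tcp prog /usr/local/bin/prog\n' > Makefile

make -n install    # prints the cp command without executing it
```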
Also, keep in mind that many programs are notoriously hard to compile, even for experts. (OpenOffice.org is a prime example) This experience can be very frustrating, but persistence is the only way to get through it.