In compiling a list of the world's oldest software companies, one comes face to face with an inevitable question. Namely, what is it? What the heck is this thing we call "software?"
We searched the darkest corners of our brains and perused the online dictionaries for quickie text bytes and never really could come up with a single, all-purpose answer. Is it the overtly simplistic "Anything that is not hardware but is used with hardware" or the seemingly too limiting "The programs used to direct the operations of a computer?" How about this metaphysical beauty: "Unlike hardware, software can't be touched." Ouch. That makes our heads hurt.
While it's easy to say that Windows or Office or even the wanton dismemberment of Dead Space 2 are obvious examples of software, where does one draw the line? Did software, for instance, exist before the advent of computers? In our minds, it did. Though the concept of altering the performance of mechanisms by feeding them independent sets of instructions has clearly become rampant in the computer age, it in fact started long before that – the early 18th century, to be exact. And that is precisely where we'll start our journey.
Please remember as you read that software – and for that matter, computers – were with us long before the desktop PCs that so radically changed everything. Moreover, just because Joe Blow in some dungeon in Joe Blow Land cranked out a few lines of code before one of the key players, we've elected for the purposes of this article to ignore Joe and highlight instead those companies that history will see as having made a serious impact. Ergo, our countdown may seem a bit scattered. It isn't. It's perfect.
To accurately discuss one of the most important companies in the development of early computing, one needs to discuss its most important contribution, the punch card. And to accurately discuss the punch card, one must transport oneself all the way back to the beginning of the Industrial Revolution.
As any history buff will tell you, the refinement of the textile loom in the early 18th century – which permitted the mass-manufacture of clothing – was a critical moment in the history of modern man. So when two French dudes, Basile Bouchon and Jean-Baptiste Falcon, added perforated paper rolls to the equation, thus allowing one person to output a variety of goods without enlisting assistants, it was a Very Big Deal that is today considered by many to be the birth of software.
In time, the punch card concept was used with a variety of machines. But it was a fellow by the name of Herman Hollerith (above, left) who pioneered the idea of recording data on a punch card that would then be read by a machine. The US government liked the idea enough to use it in its 1890 census, the first-ever census to be machine tabulated.
In 1911, Hollerith's burgeoning enterprise, the Tabulating Machine Company, was one of four businesses that merged to form the Computing Tabulating Recording Corporation (CTR). Though the new company's reign under the CTR moniker was brief – just 13 years – it was eminently notable not just for its punch cards but also for the fact that in 1924 it was renamed something just slightly more familiar – IBM.
Think IBM and you may think hardware and mainframes. And rightfully so. But you need only look at its birth to see it boasted a not insignificant software component right from the start.
As we documented earlier, it was a merger between four companies that formed the Computing Tabulating Recording Corporation (CTR), the predecessor of IBM, exactly one hundred years ago. Of these four concerns, two were of the software ilk. Granted, we're talking primitive software, but software nonetheless.
First, and as we've already discussed, there was Herman Hollerith's Tabulating Machine Company. But we can't forget International Time Recording Company, a business founded in 1888 on the principle that a tardy employee was a bad employee. ITRC pioneered the notion of the employee time clock – and of course the associated "time cards" – a concept that would ultimately be a mainstay throughout industry for much of the following century.
IBM's done a ton of stuff since then and had its fingers in virtually any pies it could find as it built itself into one of the world's premier corporate goliaths. Yet software has always played a role.
Developed in the 1950s, FORTRAN has been at the FORefront of high-level programming languages ever since. COBOL, also developed in the rock and roll 50s, remains a standard language for business apps. And of course there's Lotus Development Corporation and its Lotus Software, acquired by IBM in a hostile takeover in the mid-90s. IBM. Big Blue. Big software.
Back in the day ("the day" meaning pre-Beatles, when the number of computers worldwide was just a few thousand) the term "software" wasn't even on the radar, and the coding that ran the beastly machines – each about the size of a school bus – was developed either by the company that built the computer or by the users themselves.
Two ex-IBM workers, Elmer Kubie and John Sheldon, had a plan to change all that. Veterans of the IBM Technical Computing Bureau in New York, Kubie and Sheldon saw a need for independent software providers and in 1955 formed a business they would call, inventively enough, Computer Usage Company (CUC).
In time, CUC would build programs that simulated the flow of oil (for the California Research Corporation) and tracked election results (for the CBS television network). It did business with the US Navy, and developed a compiler – a program that translates source code into object code – for the Federal Aviation Administration.
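That compiler aside deserves a moment of illustration. At its heart, a compiler turns human-readable "source code" into instructions a machine can execute. Here's a deliberately tiny sketch in Python – `compile_expr` and `run` are hypothetical names of our own, and real compilers (CUC's included) are vastly more involved:

```python
# Toy illustration of what a compiler does: translate human-readable
# source text into simple machine-style instructions ("object code"),
# which a separate machine (real or pretend) then executes.

def compile_expr(source):
    """Compile source like '2 + 3 + 7' into LOAD/ADD instructions."""
    tokens = source.split('+')
    program = [('LOAD', int(tokens[0]))]   # first operand loads the accumulator
    for t in tokens[1:]:
        program.append(('ADD', int(t)))    # each further operand adds to it
    return program

def run(program):
    """A pretend machine that executes the compiled instructions."""
    acc = 0
    for op, arg in program:
        acc = arg if op == 'LOAD' else acc + arg
    return acc

obj = compile_expr("2 + 3 + 7")
assert obj == [('LOAD', 2), ('ADD', 3), ('ADD', 7)]
assert run(obj) == 12
```

The split between "translate once" and "execute many times" is the whole point – and it's why a compiler was valuable enough for the FAA to commission one.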
In 1967, as Hendrix performed at Monterey and two years before man walked on the moon, CUC sported a staff of 700 and revenues in excess of $13 million. Twenty years later, in the midst of a rapidly changing industry – and likely because of it – the company was in Chapter 7 bankruptcy. But not before staking claim to being the first independent computer software developer. Ever.
Any company that's survived more than 50 years must be doing something right.
In 1959, a bright young IBM programmer by the name of Roy Nutt (below, left), a key player in the development of IBM's groundbreaking FORTRAN programming language, banded together with marketing specialist Fletcher Jones (below, right) to found Computer Sciences Corporation. Legend says the total investment of both men was a mere $100.
Today, Nutt and Jones have sadly gone to the great mainframe in the sky, yet CSC annual revenues regularly hit the $15 billion mark and the company employs more than 91,000 people in 90 countries. In between, CSC pioneered early software development, defeated rival CUC (see above) at its own game, became the largest software concern in the United States in 1963, and has counted among its customers such trifling outfits as NASA, the federal government, and AT&T.
All of this from a business most people may not even know, and one that certainly hasn't produced any software at any time that's become a household name. Which proves you don't need to be a household name to do a good job and make one heck of a lot of money.
In May of 1961, President John F Kennedy predicted America would by the end of the decade place a man on the moon. Just two and a half years later, the world had lost one of its great leaders. Yet there was no shortage of folks ready to keep Kennedy's grand dream alive over the course of the next six years.
Certainly MSC Software was in there doing its share. Debuting in 1963 as MacNeal-Schwendler Corporation, the company specialized from the start in structural analysis, developing software for pre-PC computers that simulated the functionality of complex engineering designs. Its first product, SADSAM (Structural Analysis by Digital Simulation of Analog Methods), was designed specifically for the aerospace industry, and by 1965, MSC was involved heavily with NASA.
Today, MSC Software employs 1,200 people across 23 countries and says it can count virtually every OEM in the world as an MSC customer.
The early 1960s - 1963 in particular - was an instrumental time in the up-and-coming software world. We've already referenced several other companies that either began life or began to flourish during this era, decades before the concept of home computing had even been concocted, and we can now add another. Founded in San Francisco in 1963 and incorporated one year later, it was called Applied Data Systems Inc. (ADSI) and it lays claim to being "one of the oldest established, independent software companies in the world."
By 1967, ADSI was big news. Big enough that on June 21st of that year, it was featured on the front page of what would eventually become an industry media staple – Computerworld. There, in bold headlines, we see, "COBOL, RPG Bested By New Language?"
What the heck were they talking about? Well, it seems ADSI, under the leadership of one Peter Harris – who himself had turned down an earlier offer from fellow ex-IBM buddy Roy Nutt over at the newly formed Computer Sciences Corporation – had been slaving away on a new programming language dubbed ADPAC. ADPAC, claimed ADSI, was far superior to the much-ballyhooed COBOL language (according to Harris, "The world thought COBOL was just terrible") and ADSI was out to prove it.
ADPAC was ultimately successful but not as successful as Harris felt it should have been – a situation he blames on IBM scooping all the juicy government contracts. Today, ADSI, now renamed ADPAC after its original programming language, continues to deliver mainframe solutions to those who need them.
Today, Cincom Systems proudly proclaims on its very own website that "The history of the software industry really follows the history of Cincom." And you know, they're not that far off the mark.
Founded in 1968, the same year we saw the television debuts of Hawaii Five-O and Rowan and Martin's Laugh-In, Cincom was at the outset unique among the rest of our Top Ten. You see, it did not make hardware – it sold software, and software alone. We feature it here because Cincom recognized, long before the vast majority of people both inside and outside the industry, that software would one day be an entirely separate entity. Ultimately, its contribution was arguably just as key to the maturation of the industry.
Selling software in 1968? At a time when the mere suggestion of computers was enough to make most of us shake our heads and verbally question the sanity of anyone even thinking such weirdness? Why, yes, that's exactly what Cincom did.
Granted, the road wasn't easy. With a $600 investment and an office that consisted of a card table in the home of co-founder Thomas Nies, Cincom ran rather lean at the start, promoting the soon to be trendy idea of database management and selling one such system, named "TOTAL." But the years were kind to Cincom and less than a decade later its offices were found in such faraway spots as Japan, Belgium, and Australia.
The only constant in the computer industry is, apparently, Tom Nies' facial expression.
Today, with Nies still running the ship, Cincom is a multinational giant. That its point man is featured in the Smithsonian Institution's Computer History Collection is the icing on the cake.
By strict chronological definition not one of the world's first software developers, Nintendo Company is nevertheless a positively ancient entity by any standards – beginning life as Japanese playing card manufacturer Nintendo Koppai way back in 1887 – and is certainly one of the first businesses to jump aboard and drive the arcade and home video gaming bandwagon.
Now worth somewhere in the neighborhood of 100 billion dollars (what's a few billion between friends?), owner of Major League Baseball's Seattle Mariners, and creator of such classic – some might say irritating – video gaming icons as Mario and Donkey Kong, Nintendo was far less affluent and far less prestigious when it made the switch in the mid-70s from toys and other less successful ventures to electronic games.
Even then Nintendo's primary interest seemed to lie in entire game systems, and soon it had become the Japanese distributor of one of the very first home gaming consoles, the Magnavox Odyssey. The "Color TV Game" series of consoles followed soon thereafter, but by the early 80s Nintendo had begun to carve a historical niche by shipping all-time software favorites such as the aforementioned Kong and Mario on both home-brewed and third-party systems.
It can be argued that Nintendo's biggest claims to fame are its full-on systems, but the company also delivered plenty of software to go along with the hardware. In the end, Nintendo's two-pronged attack helped shape the home gaming industry.
Microsoft is by no means one of the world's first software companies. But we simply can't keep harping on about programming languages, mainframes, and punch cards when most people consider software to be the programs we as end users load on our personal computers. Thus, we're compelled to discuss the Redmond, Washington giant simply because its founders, Bill Gates and Paul Allen, are generally acknowledged not just as among the first to accurately forecast that software, as opposed to hardware, was the way of the future, but also as among the first to successfully act on it.
And act on it they did. Though there was no shortage of entrepreneurs who seemingly saw the writing on the wall and entered the PC software game in its formative stages (such as Digital Research's Gary Kildall, the creator of the wrongly short-lived CP/M operating system), Gates and company finagled and stepped on a few toes (most notably Kildall's) and took it to the max, writing a programming language for the primitive yet seminal Altair 8800 in 1975, developing first the Xenix and then the MS-DOS operating systems, and then pumping out the WYSIWYG Microsoft Word in 1983 and of course the graphical extension of MS-DOS, Microsoft Windows 1.0, in 1985.
By 1990, Microsoft had cut its earlier ties with IBM, released the Microsoft Works office suite, and slammed the market with a Windows version that had staying power – Windows 3.0. The rest, as they say, is history – except for one thing. Say what you will about Gates' business practices – and many have – but dude is one heck of a philanthropist.
Like Microsoft, Steve Jobs' and Steve Wozniak's Apple Computer may have been a late arrival to the computer party. But it wasn't late where it counted – the PC revolution.
Indeed, Apple's first computer, 1976's Apple I, was notable not just because it was hand-built by Wozniak – which it was – but also for its compatibility with – gasp! – keyboards and monitors. Scoff if you want, but this was a time when competing PCs, of which there were precious few, relied on toggle switches and blinking lights. Can you say "Star Trek?"
But back to the software. Though the Apple I didn't actually feature software so much as firmware, all that would change as new units rolled out. By 1978 and as an upgrade to the Apple II, Jobs and Wozniak released both a disk drive and Apple's first operating system, Apple DOS. By 1980 and the business-oriented Apple III, we had Apple SOS (Sophisticated Operating System), which would then morph into Apple ProDOS three years later.
But it was the Apple "Lisa," foisted upon an unsuspecting public just one year prior to the first Mac and priced at a groan-inducing $9,995, that heralded Apple's first foray into full-blown graphical operating systems. In it, we found file browsers, document icons, spreadsheets, drawing tools, and much, much more. Apple's graphical OS thus predated Microsoft's by two years and that alone was one giant leap for mankind.