Personally, I feel that I would be a better programmer and computer scientist today if I HADN'T gone into industry.
Personally, I feel the other way around. I wouldn't be where I am today if I hadn't gone into industry. Then again, I had the opportunity to work with great people on big projects.
I should say that it hasn't been all bad. During the past five years (omg, how time flies!!!), I've worked on probably five or six new projects and helped maintain about the same number across three different teams. The cool part was that each of these teams was a different size and had different types of problems. The first team was small and (surprisingly) I had a good boss when I first started working for them. The programmers sucked. The upside was that I was given a green light to develop two new projects in my first year and a half (with minimal technical supervision -- hours and hours of stupid meetings though).
Later, I was transitioned to a larger team with "more difficult" problems (ie bigger maintenance messes to clean up -- gotta love Unix). The shocker for me was that the small team was actually better in many ways. The technical leadership of the larger team made brain-dead decisions like writing their own XML parser in C and using zillions of environment variables (literally, 50+) because they were too incompetent to parse a configuration file. And to top it off, you would never be allowed to go in and replace broken code with this group because (1) you faced all of the usual politics and pettiness ("what do you mean the xml parser is broken... the other team shouldn't have changed their xml file format" type of stupidity), and (2) you actually had quite a bit of government oversight, which meant that any changes had to go through various reviews and had to be explained and... basically, it would make the leads and managers look dumb.
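(For contrast: their code was C, but just to show how little work "parsing a configuration file" actually is, here's a minimal Java sketch using java.util.Properties. The file name and keys are hypothetical.)

    import java.io.FileInputStream;
    import java.util.Properties;

    public class AppConfig {
        public static void main(String[] args) throws Exception {
            Properties config = new Properties();
            // One file load replaces dozens of getenv() lookups.
            config.load(new FileInputStream("app.properties"));
            // Hypothetical keys, with defaults for anything missing.
            String host = config.getProperty("server.host", "localhost");
            int port = Integer.parseInt(config.getProperty("server.port", "8080"));
            System.out.println("Connecting to " + host + ":" + port);
        }
    }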
After five years in industry, I can honestly say that I didn't learn one thing that I didn't know starting out (although I did teach myself Java Reflection -- quick sketch below -- but I was already familiar with the concept of "meta-programming" from Lisp). Certainly, I did work on types of projects that I wouldn't have done normally (web programming, apps in C, etc) and used tools that I wouldn't have used normally (eg JBuilder and other assorted relics), but I can honestly say that I haven't learned one thing in terms of computer science. I was pretty astonished at the general level of incompetence at even a Fortune 20-sized company. While some of the people that I've worked with in academia have been terrifyingly smart, none of the people in industry ever impressed me (nor my buddy Andrew, who is starting a PhD this year). Maybe I've just been lucky in that regard at USC, and other people have found their professors or research labs to be less interesting.
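(If you haven't seen it, Reflection is the Java flavor of that meta-programming idea: inspecting and invoking code at runtime by name. A minimal sketch -- the class and method here are real JDK ones, picked just for illustration:)

    import java.lang.reflect.Method;

    public class ReflectionDemo {
        public static void main(String[] args) throws Exception {
            // Look up a class by name at runtime...
            Class<?> clazz = Class.forName("java.lang.String");
            // ...find one of its methods by name...
            Method method = clazz.getMethod("toUpperCase");
            // ...and invoke it without the type being known at compile time.
            Object result = method.invoke("hello, reflection");
            System.out.println(result); // HELLO, REFLECTION
        }
    }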
My academic friends aren't worried about unit testing their work, but to me, there's no sense in researching languages or algorithms if, when they're implemented, they aren't tested or they degrade the quality of the work.
The thing that surprised me about industry, especially with the current "sexiness" of unit testing, is that NOBODY had written a single unit test in any of the groups that I worked in. Plus, most of the people that I've talked to about it at other companies don't write them either (minus a couple of exceptions: one at Sun and two at Google)!
Here is my unit testing hypothesis. I believe that people who come from better computer science departments (ie really study algorithms, language theory, compilers and have to write lots of hard code -- or paid the proverbial pound of flesh at some point) tend to gravitate toward writing automated unit tests. They've had to use them and seen the light.
Those people that come from the "dark side" (ie software engineering), on the other hand, like to write documents... design docs, req docs, testing docs, maintenance docs, process development docs, on and on... they write more than Mark Twain ever did. However, they never write unit tests, because that is "code" and unit testing is not regression testing or integration testing, etc. They just don't get it, and somehow they've been brainwashed into believing that writing a "test plan" in Word is actually a useful thing for stamping out bugs.
I always write unit tests. I have to for my academic projects because that programming tends to be hard. I'd rather spend a few "easy" minutes writing unit tests than an hour in a debugger doing the wtf thing. At work, most of the code is relatively easy, but I write unit tests anyway as a form of proof that "my code" works. There have been a number of occasions where someone has filed a bad bug report concerning my code and I've been able to refute it with existing unit tests. SCR rejected! =)
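(To make that concrete: a test like this one -- JUnit 4 style, with a hypothetical PriceCalculator as the class under test -- takes a minute to write and then stands as a rerunnable counterexample to any bogus bug report:)

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class PriceCalculatorTest {
        @Test
        public void appliesDiscountToTotal() {
            // PriceCalculator is hypothetical; substitute your own class.
            PriceCalculator calc = new PriceCalculator();
            // A 10% discount on 100.0 should come out to 90.0.
            assertEquals(90.0, calc.applyDiscount(100.0, 0.10), 0.001);
        }
    }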
Though the one place, I would say, where academia and industry meet is finance.
Funny that you should mention that... I don't want to jinx anything, so I'm going to stay mum on the finance stuff for now.
Actually, I think that academia and industry meet in a number of places (eg research labs, startup companies, etc). The problem is that the vast majority of industry positions have changed from being the domain of a few highly accomplished, mathematically/technically inclined people toward something more akin to the "army of programmers" engineering model, and unfortunately, academia has given in to this mentality. Everything seems to be dumbed down to the level of middle management now. For example, one of the groups that "interviewed" me at Boeing had never even heard of Lisp. There were three managers (all former software guys) and four developers and NOT ONE OF THEM HAD HEARD OF LISP*!!!!! Not surprisingly, the program was $80 billion over budget and cut early last year (FCS). Anyways...
* I didn't bring it up; they were reading my resume. At Boeing, you're constantly being shuffled around from one small group to another, and you "interview" for each one. I've probably spent 10 hours in "interviews" during the past two or three years, and I don't think that I've been asked more than three technical questions -- total.
I believe that you'll always have ideas coming out of academia being implemented in industry -- IBM, Google and Akamai are three examples among many more where a new company basically formed around a new algorithm. As Gates has noted, most modern businesses don't have a long enough time horizon to really work on something akin to what the Wright brothers did. IMHO, academia fills that niche. You have startup companies springing up out of the big university labs -- Symbolics in the 80s, Sun in the 90s, iRobot in the 00s (among several other companies and industries). The established research labs at major companies continue to work on interesting new stuff. Academia and industry certainly have overlapping interests, but for most jobs that overlap really is the exception. Of course, big companies eventually buy these smaller companies and kill the culture, but that is another story.
Though I work in healthcare, that's one of my personal challenges: how can I structure code so that it's functional, accountable, traceable and testable?
That's pretty cool. It sounds like you are still allowed to do the "right thing" as opposed to the "worse is better" thing that started with Unix. See my next post in a bit.