There is nothing new under the sun.

Well, the rant on TechCrunch has gone global: Tech’s Dark Secret, It’s All About Age.

Excuse me while I throw in my two cents, as a 44-year-old software developer.

  1. Pretty much all of the useful stuff in Computer Science was invented by the 1960s or 1970s. Very little out there today is really “new”: Mac OS X, for example, is based on Unix–whose underpinnings can be traced back to 1969, with most of the core concepts in place by the early 1980s.

    Even things like design patterns, APIs and object-oriented programming stem from the 1970s and 1980s. Sure, the syntax and calling conventions have changed over the years, but the principles have stayed the same.

    For example, take the MVC pattern, first discussed formally about twenty years ago. The ideas behind it weren’t “invented” then; the MVC papers from Taligent simply codified common practices that had been evolving in the industry well before that. One can find traces of the idea of separating business logic from presentation logic in things like curses, or in Xerox PARC’s work from the 1970s. I remember writing (in LISP) calendrical software for Xerox in 1983, as a summer intern at Caltech, using the principles of MVC–though it wasn’t quite called that back then.

    Or even take the idea of the view itself. The idea of a view as a rectangular region in display space, represented by an object with draw, resize and move methods plus mouse-click and keyboard-focus handlers, can be found in InterLisp, in NeXTSTEP, in Microsoft Windows, and on the Macintosh in PowerPlant; hell, I even wrote a C++ wrapper for MacOS 6 called “YAAF” which held the same concepts. The specific names of the specific method calls have changed over the years, but generally there is a draw method (doDraw, -drawRect:, paint, paintComponent, or the like), a mouse down/move/up handler, a resize handler (or a message sent on resize), and so on.
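    Strip away the syntax and all of those frameworks reduce to the same shape. Here is a minimal sketch in C++; it is not any particular framework’s API, and every name in it (View, Rect, onMouseDown and so on) is invented for illustration:

    ```
    // A minimal sketch of the perennial "view" object. All names are
    // hypothetical; every real framework spells them differently.
    struct Point { int x, y; };
    struct Rect  { int left, top, width, height; };

    class View {
    public:
        explicit View(const Rect &frame) : fFrame(frame) {}
        virtual ~View() = default;

        // The draw method: doDraw, -drawRect:, paint, paintComponent...
        virtual void draw() = 0;

        // The resize handler (or message sent on resize).
        virtual void onResize(const Rect &newFrame) { fFrame = newFrame; }

        // The mouse down/move/up handlers.
        virtual void onMouseDown(const Point &where) {}
        virtual void onMouseMove(const Point &where) {}
        virtual void onMouseUp(const Point &where) {}

        // The keyboard focus events.
        virtual void onFocusGained() {}
        virtual void onFocusLost() {}

    protected:
        Rect fFrame;    // the rectangular region in display space
    };
    ```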

    The idea never changes; only the implementation.

    Or hell, the Java JVM itself is not new: from p-machines running a virtual machine interpreter for Pascal, to the D-machines running InterLisp, bytecoded virtual machine interpreters have been around longer than I’ve been on this Earth. Hell, Zork ran on a virtual machine interpreter.
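    The trick has never been complicated, either: at bottom, a virtual machine interpreter is a loop that fetches an opcode and dispatches on it. A toy sketch–the opcode set here is invented, not the p-code or Z-machine instruction set:

    ```
    #include <cstdint>
    #include <iostream>
    #include <vector>

    // A toy stack-based bytecode interpreter: the same fetch/decode/execute
    // loop that drove the p-machine, the D-machines, and the Z-machine.
    enum Op : uint8_t { PUSH, ADD, MUL, PRINT, HALT };

    void run(const std::vector<uint8_t> &code) {
        std::vector<int32_t> stack;
        size_t pc = 0;                         // program counter
        for (;;) {
            switch (code[pc++]) {              // fetch and dispatch
            case PUSH:                         // next byte is a literal
                stack.push_back(code[pc++]);
                break;
            case ADD: {                        // pop two, push the sum
                int32_t b = stack.back(); stack.pop_back();
                stack.back() += b;
                break;
            }
            case MUL: {                        // pop two, push the product
                int32_t b = stack.back(); stack.pop_back();
                stack.back() *= b;
                break;
            }
            case PRINT:
                std::cout << stack.back() << "\n";
                break;
            case HALT:
                return;
            }
        }
    }

    int main() {
        run({PUSH, 2, PUSH, 3, ADD, PUSH, 7, MUL, PRINT, HALT});  // (2+3)*7 = 35
    }
    ```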

  2. I suspect one reason why you don’t see a lot of older folks in the computer industry is self-selection. Staying in an industry populated by nihilists who have to reinvent everything every five years or so (do we really need Google Go?) means you have to be constantly learning. For some people, the addiction to learning something new is very rewarding. For others, it’s stressful and leads to burnout.

    Especially for those who are smart enough to constantly question why we have to reinvent everything every five years, but who don’t like the constant stress of it, I can see deciding to punt it all and get into a job where the barbarians aren’t constantly burning the structures to the ground just because they can.

    I know for a fact that I don’t see a lot of resumes from people in their 40s and 50s. I’m more inclined to hire someone in their 40s as a developer than someone in their 20s, simply because you pay less per year of experience for someone who is older. (Where I work, someone with 4 or 5 times the experience commands perhaps an 80% or 90% salary premium: paying 1.9 times the salary for 5 times the experience is well under half the cost per year of experience–a great value.)

    But I also know quite a few very smart, bright people who decided they just couldn’t take the merry-go-round another time–and went off to get their MBA so they could step off the mental treadmill and into a more lucrative career.

    I have to wonder, as well, where I would be if I had children. Would I have been able to devote as much time to reading about the latest and greatest trends in Java development or Objective-C or the like, with a couple of rug-rats running around requiring full-time care? (I probably would have, simply because, on the whole, I’d rather read a book on some new technology than read the morning paper. I would probably have sacrificed my reading on history and politics for time with my children.)

  3. There is also this persistent myth that older people have familial obligations and are less likely to want to work the extra hours “needed to get the job done.” They’re less likely to want to pull the all-nighters needed to get something out the door.

    But in my experience, I have yet to see a development death march–constant overnighters paid off in pizza–that didn’t come about because of mismanagement. I don’t know of another industry in the world where botching the resource sizing, then demanding your workers work overtime to compensate for that failure of planning, is seen as a “virtue.”

    And I suspect the older you get, the less likely you are to put up with the bullshit.

    Having seen plenty of products make it to market–and plenty not–and having lived through several all-nighters and product death marches, I see a common theme: either a product’s sizing requirements were mismanaged, or (far more commonly) upper management was incapable of counting days backwards from a ship date and properly assessing what could be done.

    The project I’m on, for example, was given nearly a year to complete. And Product Management pissed away 7 of those months trying to figure out what needed to be done.

    The younger you are, the less likely you are to understand that three months is not forever: if you need to have something in customers’ hands by December, you have to have it in QA’s hands by September or October–which means the individual modules have to be done by July. Without the experience of how quickly July becomes December, it’s easy to simply piss away the time. The arithmetic is trivial; it just has to actually be done, as the sketch below shows.
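    A back-of-the-envelope version in C++20–the ship date and the phase lengths here are invented for illustration, not a real plan:

    ```
    #include <chrono>
    #include <iostream>

    // Count backwards from the date the product must be in customer hands.
    int main() {
        using namespace std::chrono;

        sys_days ship = sys_days{2010y/December/1};  // in customer hands
        sys_days toQA = ship - weeks{9};             // QA needs ~9 weeks
        sys_days code = toQA - weeks{10};            // modules done well before QA

        std::cout << "Ship:            " << year_month_day{ship} << "\n"
                  << "Hand to QA by:   " << year_month_day{toQA} << "\n"
                  << "Modules done by: " << year_month_day{code} << "\n";
    }
    ```

    That prints December 1, September 29 and July 21: July really is the deadline that matters.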

    So I can’t say that it’s a matter of older people not being willing to do what it takes–if upper management were also willing to do what it takes, projects would be properly sized and properly planned. No, it’s more a matter of “younger people don’t have the experience to do proper long-term planning to hit deadlines without working overtime,” combined with “younger people don’t have the experience to call ‘bullshit’.”

  4. There is also, as an aside, a persistent myth that it takes a certain type of intelligence or a certain level of intelligence to be successful in the software industry.

    I’m inclined to believe more in the 10,000-hour rule: if you practice something for 10,000 hours, you will become good at that thing.

    Intelligence and personality could very well help you accumulate those 10,000 hours: the first few hours of learning how to write software, or of learning a new API or a new interface, can be quite annoying and stressful. But if you persist, you will get good at it.

    Which means IQ and personality, while perhaps providing a leg up, don’t guarantee success.

    It’s why I’m also inclined to favor more experienced, older developers who have persisted with their craft. If we assume 6 hours of actual development work a day (with the other 2 spent on administrative stuff), then a work year of roughly 250 days has only about 1,500 hours–meaning 10,000 hours takes about 7 years to accumulate. Assuming you start out of college at 21, this means that anyone under the age of 28 will not have had sufficient experience to be good at their craft.

    And that assumes they practiced their craft rather than just going through the motions.

The whole “it’s all about ageism” explanation for the tech industry is an interesting meme–simply because the reality is far more complicated than that.
