
Coders at Work

The coders interviewed in “Coders at Work” all have interesting opinions, but it’s the recurring themes that have really grabbed me.

The first theme is that all of these people are human. They might all be famous for doing X, but hardly any of them set out consciously to do X, and none of them did what they did because they knew it’d lead them to where they are today. When you read about how they got into their area, the recurring themes are serendipity and “I did it because it was fun”. As Simon Peyton Jones says explicitly, the important thing is just getting started – because once you get started there are a million interesting things you could play with. I don’t want to downplay the cool stuff some of these guys have done, but it’s hugely enlightening to hear them talk in their own words and realise how “normal” they all are.

The other theme I noticed is that everyone lives in their own little niche. Very few people in the book seem to have a broad overview of computing and how it’s changed over the decades. In particular, you can see how people’s thinking is constrained by either the era in which they learned about computers, or by the particular area that they’ve specialized in. It’s refreshing to hear Simon Peyton Jones say that he doesn’t really have a deep understanding of OO programming, because he’s basically not done that much of it – he doesn’t knock it, either. It’s weird to hear Peter Deutsch describe his dream language without knowing that these ideas have already been tried in Haskell. It’s interesting to hear people who are famous as ‘Lisp guys’ or ‘Smalltalk guys’ knocking ‘their own’ language. And it’s amazing to see the split between low-level and high-level thinkers. I’m biased, because I’m into programming languages, but few people commented on the extent to which your preferred language affects your modes of thinking – although the results are plain to see.

Finally, this book made me realise that I’ve been in this game for quite a long time now (I’m only 32!). Enough time for entire chapters of knowledge to have come and gone. Programming in assembler, gone! Well, not totally; it’s still useful for compiler backends, security exploits and suchlike. Manual memory allocation, gone! Well, not totally; there’s still kernel development and embedded stuff. Segmented memory models, gone, but back for a wee while with PAE. Implementing primitive data structures, largely done for you! C++, gone (for me at least)! I spent so much time getting really good at it too, hmmph. I respect it for what it is, but there are much nicer ways to spend your life.

But that’s all fine. Technology reflects the era it was born in. C made sense when memory was expensive and CPUs were slow. Now virtual memory and VMs make sense. When resources were scarce, conceptual clarity was sacrificed to gain performance. Now we usually don’t need to make that sacrifice. The abstractions which made sense for a 1990s desktop GUI app aren’t the ones you need for a 2009 network-based distributed system.

Is history important? Only partly, I think. The high-level lessons are certainly important, but the details aren’t. Do you need to be able to code up a red/black tree today? No. But I think a developer should have a deep appreciation for the distinction between interface and implementation – and you should understand how the implementation choices can affect you (as a user). Do you need to understand low-level hardware and assembler? No, but the concepts and solutions which crop up at that level crop up elsewhere too, so it’s certainly not wasted knowledge. Do you need to learn Smalltalk? Only really to learn the ‘lessons of Smalltalk’ – to see what you can do with a reflective, late-bound, heavily interactive system.
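To make that interface/implementation point concrete, here’s a toy sketch in Python (not from the book – the names KeyValueStore, HashStore and SortedStore are invented for illustration). The same key-value interface is backed once by a hash table and once by a sorted structure standing in for a red/black tree; the contract is identical, but the user still feels the implementation choice in iteration order and in how lookups and inserts scale.

# Toy illustration: one interface, two implementations (names are made up).
import bisect
from abc import ABC, abstractmethod

class KeyValueStore(ABC):
    """The interface: put, get, and list the keys."""
    @abstractmethod
    def put(self, key, value): ...
    @abstractmethod
    def get(self, key): ...
    @abstractmethod
    def keys(self): ...

class HashStore(KeyValueStore):
    """Backed by a hash table: O(1) average lookup, keys come back in insertion order."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data[key]
    def keys(self):
        return list(self._data)

class SortedStore(KeyValueStore):
    """Backed by a sorted list (a stand-in for a balanced tree):
    O(log n) search, O(n) insert, keys always come back sorted."""
    def __init__(self):
        self._keys, self._values = [], []
    def put(self, key, value):
        i = bisect.bisect_left(self._keys, key)
        if i < len(self._keys) and self._keys[i] == key:
            self._values[i] = value          # overwrite existing key
        else:
            self._keys.insert(i, key)        # insert keeps the list sorted
            self._values.insert(i, value)
    def get(self, key):
        i = bisect.bisect_left(self._keys, key)
        if i == len(self._keys) or self._keys[i] != key:
            raise KeyError(key)
        return self._values[i]
    def keys(self):
        return list(self._keys)

# Same calls against the same interface, different observable behaviour:
for store in (HashStore(), SortedStore()):
    for k in ("banana", "apple", "cherry"):
        store.put(k, len(k))
    print(type(store).__name__, store.keys())
# HashStore   ['banana', 'apple', 'cherry']   (insertion order)
# SortedStore ['apple', 'banana', 'cherry']   (sorted order)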

All in all, I’m philosophical about the amount of technology water that’s passed under my particular bridge. Easy come, easy go; I am not the sum of my knowledge. I’m happy to keep absorbing new fun stuff as times change, and mostly I’m quite happy to see the back of the declining technologies anyway! It’s comforting, in a way, to see that there are no real wizards out there – just people hacking away on stuff they think is cool and being ready to recognize the insights and the epiphanies when they come by. Evolution (and some marketing $$$s) usually takes care of picking out those solutions which are suited to the present environment. And there are always interesting “superior but ignored” technologies hovering in the wings.

Computing has only really been around for one lifetime. Most of the first-generation guys are still alive. It’s interesting to hear some of them reflecting on a life spent involved in this area. I guess I’m taking a moment to reflect on where I am.

“Coders at Work”: loving it

2 replies on “Coders at Work”

Cool – I’m waiting for the library to get their copy in, but it sounds better than O’Reilly’s ‘Beautiful Code’, which I found pretty disappointing.

C++ and even occasionally assembler are still alive here, alas. But I think I’m about to see a fair few parallel computing technologies come and pass under the bridge in rapid succession.

Mostly asking this here rather than later at work because I will probably forget otherwise…

Given your interest in language, have you read the similarly interview-based “Masterminds of Programming” (http://amzn.com/0596515170)?

I was going to buy “Coders”, but got distracted by the existence of “Masterminds”, then wondered about buying that and trading lends with you.
