Andy Hunt: Secrets of a Rock Star Programmer (Part 1)

Ed: Yes, that’s true. But I’ve heard you say in your “Refactoring Your Wetware” talk that “multitasking kills.” So this is not multitasking; it’s putting everything else aside and doing the meta-cognition stuff.

Andy: Yeah, and it’s being deliberate. That’s another consistent focus through a lot of the stuff I talk about -- the idea of being deliberate about something, not letting it just happen accidentally or as a by-product, but taking deliberate steps to say, “Yes, this is what I’m doing right now, and here’s why.”

Ed: Being deliberate is certainly an attribute of a successful developer. What are some other attributes of successful developers?

Andy: My stock answer may be different from what most people think. The biggest thing I look for is language arts skills. I would rather have an English major learn to program than a math major or an engineer learn to program.

I think that’s a somewhat contrarian stance, but when you look at it, the two things that we as developers do the most are communication and learning. And by communication, I don’t necessarily mean sending e-mails or writing white papers or that sort of stuff. Programming is an act of communicating with the computer. Getting requirements, we’re communicating with the end user and with other people. Working in a team to develop the software, we’re communicating with the other members of the team, with the team as a whole, all these sorts of things.

But we [software developers] are really the communication hub between various other humans and various business technologies, and that’s a big part of what we do. I would much rather have someone who was more trained in the communication arts. Writing a program, to me, is much more like writing an essay or a novel than, say, a mathematical theorem. [A program’s] logical flow, progression of ideas, all these sorts of things that come over from the language side are much more important than being a hard math geek.

Ed: Well, this fits very well into the next question. How would you break down the technical skills one needs to possess as a software developer? The first one on that list would be communication skills?

Andy: Communication would be big. I think curiosity is [also] a huge one. Curiosity is probably the biggest thing. That’s really what drives us. You know, as you said before, you’re getting to know the project, and all of a sudden you realize you don’t know about something. For most developers, it’s not the schedule, it’s not the attributes of the project itself that drive them; it’s the curiosity.

Persistence is a key trait that goes along with that. You can’t give up at the first “page not found,” or because you don’t understand the first article you find.

Most of the best developers I know are both very curious and very persistent. So they track it down. Now, all of that has to rest on a bed of deep understanding of the fundamentals. You know, I don’t particularly care if somebody knows Java or Lisp or C# or Ruby. These things you can pick up once you’ve learned more than a few languages. The big deal is to learn how the operating system works internally.

Ed: Looking into the future a bit, new layers of abstraction are constantly being added on top of what has come before. How will future programmers deal with the necessity to understand the whole stack, all the way down to the metal?

Andy: I suspect what will happen is [that] the lower levels will begin to fall off. Even now, thinking in assembly language or working at that level is becoming more and more rare. I remember there was a time when even just endian-ness and the differences between processors were a huge issue. This was a big deal. People spent a lot of ink, and a lot of network protocol design, trying to get around it, and now it’s like, “Well, hell, everyone uses an Intel.” It hasn’t gone away, but it’s certainly lessened.
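
To make the byte-order issue concrete: here is a minimal Python sketch (an illustration, not part of the interview) of the same 32-bit value serialized big-endian -- the “network order” those protocols standardized on -- and little-endian, the order Intel processors use.

    import struct

    # The same 32-bit value, packed with explicit byte orders.
    value = 0xDEADBEEF

    big = struct.pack(">I", value)     # big-endian ("network order")
    little = struct.pack("<I", value)  # little-endian (Intel order)

    print(big.hex())     # deadbeef
    print(little.hex())  # efbeadde -- same value, bytes reversed

    # Protocols sidestep processor differences by fixing one order on
    # the wire; each host converts on the way in and out.
    assert struct.unpack(">I", big)[0] == value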

There’ll always be a need to go down to the MOSFET [hardware] level and the gates and the K-maps and whatnot and chip design and lower levels, but fewer and fewer folks will do that and, unfortunately, I suspect fewer and fewer folks will even be aware of it.

Ed: Regarding [the] skills one needs to have, did you have any courses in college that you continue to use today?

Andy: Yeah, it’s kind of tough. Thinking back, I remember a couple of courses on finite automata, where it was bits of theory that would make you think, “Okay, this really is foundational.” You know, “This is stuff you can leverage.”
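
For a sense of how small that foundational machinery is, here is a minimal sketch (illustrative only, not from the interview) of a deterministic finite automaton in Python: two states and a transition table that together recognize binary strings containing an even number of 1s.

    # A tiny deterministic finite automaton (DFA) that accepts binary
    # strings containing an even number of 1s.
    TRANSITIONS = {
        ("even", "0"): "even",
        ("even", "1"): "odd",
        ("odd", "0"): "odd",
        ("odd", "1"): "even",
    }

    def accepts(s):
        state = "even"                      # start state
        for ch in s:
            state = TRANSITIONS[(state, ch)]
        return state == "even"              # accepting state

    print(accepts("1010"))  # True  (two 1s)
    print(accepts("111"))   # False (three 1s)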

That was much more valuable than learning the OSI protocol stack, which ended up being [something] nobody used. You know, TCP won the day. Technical education tends to overemphasize transient technologies, whether that’s the current programming language or even the current programming style. You know, procedural versus object-oriented, versus functional, versus declarative, versus whatever. You should learn one of each of those kinds of languages in school so that you’ve got a good basis going forward.

A lot of people coming up don’t know what functional languages like Erlang or Haskell are; they’ve never tried declarative programming in PROLOG or anything like that. Their idea of object-oriented is Java, which is unfortunate, because that’s really not an object-oriented language. I really want to see the real solid base-level fundamentals. When you get out of school, four or five years after your freshman year, the technology is all gonna change anyway. We’re not doing what we did five years ago. We’re not gonna be doing this five years from now. In that time span, it really doesn’t make sense to get too hot and heavy into the flavor of the day.
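
For a small taste of the stylistic difference (illustrative only, and in Python rather than the languages Andy names, so the contrast is far gentler than it would be in Haskell or PROLOG), here is the same task -- summing the squares of the even numbers in a list -- written procedurally and then functionally.

    nums = [1, 2, 3, 4, 5, 6]

    # Procedural style: an explicit loop mutating an accumulator.
    total = 0
    for n in nums:
        if n % 2 == 0:
            total += n * n

    # Functional style: one expression, no mutation.
    functional_total = sum(n * n for n in nums if n % 2 == 0)

    assert total == functional_total == 56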

Ed: Well, there’s an unfortunate tension between what you and I know is a sound approach to longevity and what hiring managers who are faced with thousands of resumes are dealing with.

Andy: Oh, absolutely. My recommendation would be to spend the first three, three and a half years of your college career mastering the fundamentals and basics, putting [yourself] in a position where you can learn the technology du jour very quickly. Then you spend the last semester saying, “Okay, this is what you guys are gonna be doing when you get out there.”

Ed: Can you say anything else about fundamental skills?

Andy: First, learn the fundamentals of assembly language, real basic stuff. Meeting most developers, I can tell whether they grew up hacking inside of an operating system and writing assembly code, or whether they went to college, were taught Java, and don’t really know anything lower than that. It really shows. The folks who have a grasp of the fundamentals are, without doubt, different. When something weird happens or something crashes or something new comes along, it’s less of a big deal. They can roll with it more. And for the folks who came in somewhat late to the game and have been, I think, robbed of a proper education -- to them, how it all works is kind of magic.

Ed: But to you, it’s not magic. Is that because you are old enough to have grown up along with the computing and software industry itself?

Andy: My first computer was an Ohio Scientific [OSI Model C4P], and I was 12 years old [when I got it]. It did make a difference, because in those early days…my first several computers really didn’t have an operating system. They had a firmware monitor and everything else was do-it-yourself.

I remember when the first TRS-80s came out with their TRSDOS. I was leery of the fact that when you saved a file, it would pick what sectors on disk to put it on. You didn’t specify that by hand. I was, like, “What the hell is this nonsense? I don’t want this thing guessing for me and fragmenting my file and jamming it up all over!” I was perfectly happy with do-it-yourself file system maintenance, allocating sectors and dealing with the fragmentation by hand.

Similarly, in the days of CP/M, you had to do your own memory management; you had to use overlays to swap out portions of your program because there wasn’t enough main memory to fit it all. That’s a really interesting exercise in architecture and design. I hate to sound like an old fart, but kids today really have no awareness of that.

Ed: That’s the thing. It’s not their fault; it’s just a product of the time. We had this benefit that this was the only way you could do it. There was no other choice. We had to bootstrap our way up.

Andy: I agree wholeheartedly. I think that the folks who came out of that era where you started really close to bare metal are better suited to understanding the larger order of things.

Ed: So then maybe the takeaway would be [that] if you were a curious person, it wouldn’t sit well with you to not know how it worked all the way down to the bottom. Even if you have no interest in learning an assembly language, you have to know there is a benefit to doing so. So maybe the moral is study computer history?

Andy: No. There’s no future in history, somebody once told me. What you want to do is play. I do see the curious young developers now: they’re running [GNU/]Linux in several flavors at home, building their own media centers, screwing around with very low-level protocols and cables, soldering stuff together, and writing device drivers. That’s really what you want to do. That gives you the most modern equivalent to the sorts of experiences that the rest of us [older folk] had growing up, just getting in there and hacking, because that teaches you a fair bit. By the time you end up in a high-level language running on top of an operating system, you’ve got a good understanding of all the bits and pieces and how it all hangs together.

Ed: From the past to the future. What are some attributes of the language and environment that people will be using to develop software ten years from now?

Andy: Oh, I love this question. Okay, so here’s my thought on that, and I have no idea if this is actually going to work this way or not, but it strikes me that there is a real aspect of the cobbler’s children having no shoes here.

If you look at any of our popular computer languages today, with very few exceptions, you could render any popular program onto paper tape or punch cards. It’s a limited character set. It’s a limited line length for the most part. It’s black and white. It’s two-dimensional. It’s text. Virtually any computer program in use today could be printed on a Gutenberg press. It’s that basic, that simple. Given the richness of what we’re trying to express, it occurs to me that that’s a pretty poor model. Look at where, say, the gaming community is -- even simple things like the use of color, but also going into three dimensions, the use of spatial cueing, any kind of richer environment. It strikes me that there’s an awful lot of opportunity there for a far richer expression of programming constructs.

And I’m not talking necessarily just about graphical programming or boxes and lines and that kinda stuff, but something more along the lines of interacting in, say, Second Life or some very rich virtual environment like that, for a couple of reasons. Writing programs in black and white text seems pretty limiting -- a waste of bandwidth.

Peter Coad had a book out a few years back on UML modeling in color (Java Modeling in Color with UML, Prentice Hall PTR, 1999), which I thought had quite a lot of merit to it. He basically had a color-code scheme for different archetypes of classes. I thought that was an interesting approach.

Looking at a model real quickly, you could easily discern yet another facet of things that wasn’t shown by the class-diagram box style or the font. For the most part, though, that didn’t take off; it never really captured the imagination of the programming population, which I think is a shame. We’re back to syntax highlighting for convenience in IDEs (Integrated Development Environments), but it’s not part of the language proper. You can’t make a variable static by making it red, or what have you, and it occurs to me that we’re really missing an opportunity for much richer expression there.

So you couple that with the idea that the folks just entering the workplace now are much more imbued with this idea of gaming and first-person shooters and virtual reality and all these sorts of things that we fuddy-duddies are far less comfortable with. Even if we do it, it’s not bred into our DNA. And the groups that are growing up with that now, I think, will end up making a large impact on the very notion of what we consider a computer program to be.

So I would say -- I don’t know about 10 years from now, but maybe 20 years from now, they’ll look back on these syntax-colored IDEs with curly braces in them and snicker the same way we look back at paper tape rolls and say, “My goodness. How primitive! Those poor people. How on earth did they survive?”

Ed: Ha!

Andy: I don’t know the exact form it would take, but I can well see it looking more like Second Life and less like the Gutenberg print.

Ed: One pragmatic approach that has emerged recently, in part due to new ideas entering the workforce, but also due to advances in processor hardware, is the use of desktop virtualization products such as VMware and Parallels. Do you see virtualization playing a larger role in future software development practice?
