Hard wired, soft coded, confused

Two phrases often heard around the computing halls are "hard coded" and "soft coded". If something in an application is hard coded, it is bad - or so goes the consensus. Hard coding installation directories is bad, hard coding the IP address of the server is bad, and so on.

In the opposite corner, basking in the diffuse light of constant praise, is the concept of soft coding. If something in an application is soft coded, it is good - or so goes the consensus. Soft coding installation directories is good, soft coding the IP address of the server is good, and so on.
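To make the distinction concrete, here is a minimal sketch in Python; the variable name, environment variable, and address are hypothetical, invented for illustration. The first assignment hard codes a server address into the source, the second soft codes the same value by reading it from the environment at run time.

```python
import os

# Hard coded: the server address lives in the source itself and can only
# be changed by editing and redeploying the code.
SERVER_ADDRESS = "192.168.1.10"  # hypothetical address

# Soft coded: the same value is read from the environment at run time,
# with the hard-coded value kept only as a fallback default.
SERVER_ADDRESS = os.environ.get("MYAPP_SERVER_ADDRESS", "192.168.1.10")
```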

Like so many opposites, in computing as in life in general, I don't think it is that simple. A number of things bother me about this. Firstly, where does soft coding stop and hard coding start? How does one determine the line? Approached from a high level, it at first appears that soft coding has a natural end point. You find all the things in your application that should be configurable and you soft code those. The rest is, um, hard coded.

Okay, but what are the configurable bits of your application? Directory locations, server names... fine, but what about the look and feel of user interface screens? What about middle tier business logic? What about calculation algorithms? The line between hard and soft cannot be found mechanically, it seems to me. It is a judgement call. It is not possible to look at something and say "that is hard coded and therefore bad" or "that is soft coded and therefore good". The broader context decides the goodness or badness of where the line has been drawn.

Another concern I have relates to how our notions of what to hard code and what to soft code are, to some extent, accidents of computing history rather than discovered truths. For example, we hard code memory usage for the most part. We rely on the host operating system to give us the illusion that our hard-coded concept of lots of free and flat memory is true. It isn't. The operating system steps in and soft codes our hard coding for us. Whether we like it or not.

Virtual memory is but one of a growing list of areas where an application's notion of hard coding is turned into soft coding without the application having any say in the matter. We have routers that can pretend that any address is any other address. We have operating systems that can turn root directories into branch directories. We have windowing systems that can make entire desktops turn into mere windows.
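As a small illustration of that last point about root directories, consider the Python fragment below; the path is hypothetical. If the host launches the process inside a chroot, or bind mounts another directory over the expected location, the hard-coded string quietly resolves to a branch directory on the real filesystem. The application's hard coding has been soft coded from the outside, without its knowledge.

```python
# Hypothetical sketch: the application hard codes an absolute path.
CONFIG_PATH = "/etc/myapp/settings.conf"

def read_settings() -> str:
    # The application believes this is the one true location. If the host
    # runs the process under a chroot, or bind mounts a different directory
    # over /etc/myapp, this open() lands somewhere else entirely, and the
    # application is none the wiser.
    with open(CONFIG_PATH) as f:
        return f.read()
```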

What we have now is an odd mix: a tension between hard coding and soft coding, combined with increasingly clever ways of turning hard coding into soft coding without help or blessing from the application itself. For me, that raises the question: is it really sensible to try to soft code everywhere when, whether you like it or not, the host environment is going to make its own hard/soft configuration changes?

An alternative world, not this one for sure, works this way:

- Developers create applications based on a set of wired names for files, servers, etc. They don't bother soft coding configuration items, but they do publish what the wired items are.

- Applications are installed into environments that know how to map all the wired items, as sketched below.
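A minimal sketch of what that might look like, assuming a hypothetical published wired name "myapp-database" and port that the application never tries to make configurable:

```python
import socket

# The wired name and port are part of the application's published contract.
# The application makes no attempt to soft code them.
DATABASE_HOST = "myapp-database"  # hypothetical wired name
DATABASE_PORT = 5432              # hypothetical wired port

def connect() -> socket.socket:
    # The hosting environment, not the application, is responsible for
    # mapping the wired name to a real machine: via DNS, an /etc/hosts
    # entry, a container network alias, or whatever mechanism it prefers.
    return socket.create_connection((DATABASE_HOST, DATABASE_PORT))
```

In this world the deployment burden shifts entirely to the environment: the same application runs anywhere the wired names can be mapped.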

A number of things appeal to me about this alternative world. Firstly, soft coding is hard to get right. It is really hard to predict in advance which aspects of your application will genuinely benefit from configuration and which will not.

Secondly, it smacks of futility to try to soft code in a world where the host operating system will do its own level of soft coding on top of yours, either undoing or changing what you had intended.

Thirdly, I cannot help but wonder if application development and deployment would be a lot easier in a world where applications and operating systems are not pulling the wool over each other's eyes as to what is really going on at configuration time.

Fourthly and lastly, I cannot help wondering if computing is reaching a complexity inflection point. It happened in the mainframe era and it is happening again now. Virtualization, virtual appliances...

How long do you think it will be before the standard way to distribute an application is as a complete "virtual image" to be loaded into higher level containers that map all its drives and network connections and what have you? How long before the soft coding that developers slave over in their applications starts to gather dust as host-level configuration becomes more and more powerful? Will the world be a less confusing place as a result of this shift?

Probably not in the short term, but I have hopes for the longer term.
