The discussion about the NRC report has prompted me to reconsider a puzzle
that I have pondered from time to time:
Why hasn't Ada become more popular than it is?
I've come to suspect that part of the reason is certain "perversities"
common among programmers. Of course, this conclusion is based only on
inadequate data, anecdotal evidence, and expert judgement, so no one should
take it too seriously.
About fifteen years ago, I had a disturbing experience while working on a
software project. I had just written procedures to implement some new
capability, though I no longer recall what that capability was, or even
roughly how "big" it was. I did some testing and found that the new
procedures appeared to be working flawlessly (must not have been very big).
When I saw this, I felt a twinge of disappointment that I would not have
anything to debug, immediately followed by revulsion at the thought that I
might have been hoping for bugs. The experience taught me something
important about myself, and I'm sure the same is true of many programmers. I
enjoy the intellectual challenge of debugging. There is something about
the process that is very rewarding. Uncovering the details of what went
wrong, narrowing the scope of the problem, gathering evidence and
formulating hypotheses, and finally isolating the flaw are all activities
that provide small gratifications throughout the process. I suspect that
most programmers are people who enjoy mental puzzles of various sorts, and
that debugging provides the same kind of rewards (as long as one can make
discernible progress).
On the other hand, it seems that few programmers enjoy having a compiler
tell them that they've blundered. As a result, I suspect that many
programmers find Ada compilers annoying. They would rather discover their
"bugs" themselves. Permissive languages let them operate in this mode to a
greater degree. I am not claiming that the use of Ada will reliably result
in flawless executables, but Ada compilers will reduce the number of errors
that are left undetected until run time.
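The difference can be illustrated in a permissive dynamic language (Python
here, purely as a stand-in; the function and its name are made up for the
example). A mistake that a strongly typed, strictly checked language like
Ada would reject before the program ever ran instead survives quietly until
the bad data actually flows through:

```python
def total_cost(price, quantity):
    # A strict compiler would reject a call that passes a string where a
    # number is expected.  Here the mistake is not detected at all: the
    # operator simply does something different with the wrong type.
    return price * quantity

print(total_cost(3.50, 2))      # the intended use: prints 7.0
print(total_cost("3.50", 2))    # the blunder "works": prints 3.503.50
```

The second call raises no error; string repetition silently produces
"3.503.50", exactly the kind of bug that is discovered at run time, far
from its cause, rather than at compile time.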
Related to these issues is a tendency toward what I call "empirical"
programming: figuring out what will work by experimentation, rather than by
reading the manuals and working out the logical consequences. Of course,
this approach is often necessary due to inadequate specifications (for the
language, for the OS, for peripherals, for "legacy" subsystems, or
whatever). But even where adequately precise specifications are available,
many programmers are more inclined to "try something else" than to dig
through the manuals to try to find the real solution to a problem. While
this sort of experimentation can be a learning experience, programmers too
often settle for the first thing that works, rather than going back to the
manuals to understand why it worked. As a result, a hack that happens to
work in the few cases actually tested becomes a latent flaw that causes
problems later.
What has this got to do with Ada? This habit makes the precision of the
language and the LRM of little interest to many programmers. Also, uninformed
experimentation in Ada typically yields the annoyance of compilation
errors.
Another unfortunate trait of many programmers may be that they take pride
in mastering the arcana of programming languages, so the more arcana, the
better. Ada programmers may take pride in grasping the subtleties of
tasking, or the full power of generics, but Ada has relatively little of
the more superficial mysteries of a terse (dare I say unreadable) notation.
Such surface arcana make it easier to quickly set oneself apart from the
uninitiated, while there will still be many deeper mysteries (or
"pitfalls") as well.
The language Perl has attained considerable popularity. I regard it as an
effective language for small applications, but an abomination in language
design. According to the "camel" book (Programming Perl), "the three
principal virtues of a programmer are Laziness, Impatience, and Hubris." I
assume this is meant to be a bit of humorous hyperbole, but it seems to me
a dangerous attitude to encourage. I will admit that laziness may prompt a
programmer to seek ways to automate work, that impatience may prompt a
concern for efficiency, and that hubris may encourage a programmer to share
resulting software with others (often for a fee, of course). But if
laziness and impatience lead to sloppy practices, and hubris to a stubborn
refusal to take the advice of others (including compilers and programming
guidelines), the result is the sort of unreliable and unmaintainable
software that is so common.
Another "perverse" characteristic of many people is that they do not like
to be told what to do. A certain rebelliousness is no doubt behind much
resistance to the DoD Ada policy (or "mandate", if you prefer).
By this time, I have no doubt painted a bleak picture of programmers. I
have exaggerated. My attitude is not really so negative, but these
tendencies seem clear to me. Fortunately, many programmers have other
traits that counter these inclinations. I have always thought that
meticulousness, precision, and discipline are among the primary virtues of
a programmer, and these can go a long way toward balancing laziness and
impatience.
Finally, I have three disclaimers:
1. If any of the above is someone's opinion, it is only my opinion, as far
as I know.
2. The above ramblings are the product of a fevered mind (as I'm at home with
the flu today).
3. I have been careful to speak only of programmers. None of the above is
meant to apply to software engineers.
- Jim Hassett