TEAM-ADA Archives

Team Ada: Ada Programming Language Advocacy


From: Richard Stuckey <[log in to unmask]>
Date: Wed, 9 Jun 1999 17:45:34 +0100
Content: text/plain (1632 bytes), richard.vcf (641 bytes)
Gene Ouye said:

> Use of Ada does not preclude use of an iterative, incremental
> lifecycle.  On the contrary, I'd claim that it would
> be easier to use Ada in that kind of lifecycle than C...

This is indeed borne out by my experience. I find it is easier to
develop a quick prototype, or do incremental development, in Ada than in
C, for exactly the same reasons that it is easier to do "waterfall"
development in Ada: the language prevents me from making so many errors!

For example, imagine you declare an enumeration type for a set of
alternatives (e.g. user commands, kinds of entity in a data model, etc.)
which the program has to handle. If you later decide to add some new
items, and extend the enumeration type accordingly, the Ada compiler
will inform you of all the places (case statements, record variants,
etc.) where you must add corresponding new code and declarations.  I
don't think a C/C++ compiler will be as helpful if you add more integers
to an enum type.  If I change the signature of a subprogram, the
compiler will find all calls that are incorrect - whereas in C I have to
make sure that my header files are in sync (and use make...), or use
lint to find the mismatches.
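The enumeration point can be sketched in a few lines of Ada. The type
and procedure names below are hypothetical, purely for illustration;
the key detail is that a case statement over an enumeration type must
cover every literal, so extending the type makes every non-covering
case statement a compile-time error rather than a silent gap.

```ada
with Ada.Text_IO;

procedure Dispatch_Demo is
   --  Hypothetical set of user commands.  If a new literal, say
   --  Print_File, is later added to this type, the case statement
   --  below no longer covers every alternative and the compiler
   --  rejects the program until the new command is handled.
   type Command is (Open_File, Save_File, Quit);

   procedure Dispatch (Cmd : Command) is
   begin
      case Cmd is
         when Open_File => Ada.Text_IO.Put_Line ("opening");
         when Save_File => Ada.Text_IO.Put_Line ("saving");
         when Quit      => Ada.Text_IO.Put_Line ("quitting");
         --  Deliberately no "when others" branch: an others branch
         --  would satisfy the coverage rule and silence the check.
      end case;
   end Dispatch;
begin
   Dispatch (Open_File);
end Dispatch_Demo;
```

Note that writing "when others" trades away exactly the safety net
described above, which is why I avoid it for types I expect to extend.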

I find that using Ada, I need to remember far less about the details of
a program than I do when using C - because the Ada code is closer to my
abstract mental model, i.e. specification, of the program, and so the
"gap" in translating from the abstract to the concrete code is smaller.

"Strong typing is for people with weak memories" - well, the fewer
demands the language places upon my fallible human memory, the better!
Let the computer remember the details - that's what it's for!

      Richard Stuckey