TEAM-ADA Archives

Team Ada: Ada Programming Language Advocacy

TEAM-ADA@LISTSERV.ACM.ORG

Subject:
From: Samuel Mize <[log in to unmask]>
Reply To: Samuel Mize <[log in to unmask]>
Date: Wed, 9 Jun 1999 17:14:47 -0500
Content-Type: text/plain

Greetings,

Roger, I applaud your willingness to ask hard questions and consider
the answer, and I appreciate your work refining the question you
really need considered.

I hope you won't feel I'm slamming on you in the following, but
it seems best to speak clearly and bluntly.

In your analysis I see three major errors:

1. You are still confusing language with method.

2. You are missing Ada's advantages for prototyping.

3. You are strongly mis-stating the probability and cost of different
   kinds of errors.

I'll go into more detail.

1. CONFUSING LANGUAGE WITH METHOD

The difference between Ada and C in no way requires a disparity like
65% versus 20% to get to detailed design.  You are still comparing a
heavily front-loaded, waterfall approach with an incremental,
prototype-driven approach.  I'm sorry, but you can do either in Ada.

The C project's "detailed design" was a set of viewgraphs and a
rapidly-coded, loosely-designed prototype.  You can easily build such
a prototype in Ada.  You've already identified the major techniques:
use predefined types, don't modularize into packages much.
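For instance -- the procedure and variables below are mine, purely
for illustration -- a quick throwaway sketch in Ada can stay just as
loose as its C counterpart:

  with Ada.Text_IO; use Ada.Text_IO;

  procedure Proto is
    --  Quick and dirty: predefined types, everything in one unit.
    Speed   : Float   := 0.0;
    Heading : Integer := 0;
  begin
    for Step in 1 .. 10 loop
      Speed   := Speed + 1.5;
      Heading := (Heading + 10) mod 360;
      Put_Line ("step" & Integer'Image (Step)
                & ": speed =" & Float'Image (Speed)
                & "  heading =" & Integer'Image (Heading));
    end loop;
  end Proto;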

With either language, the next step would be to take the
prototype code and reorganize it into a final design, then proceed
with final coding.  (Or just finish off the prototype.  This is not a
snide comment.  For something like aircraft avionics, that would be
phenomenally stupid, but for a one-delivery commercial system it
might be just the ticket.)

Either approach is easy in Ada.

If you don't clean up the design, you don't get Ada's
maintainability/reliability advantages, but you lose very little.
You still get a lot of error checking.  This can be turned off in
the final system, but it will typically reduce your test time a
lot more than it increases your coding time.
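To make that concrete -- the subtype and the bad value below are just
made up -- this is the kind of thing those checks catch for free:

  with Ada.Text_IO;

  procedure Check_Demo is
    subtype Percent is Integer range 0 .. 100;
    Raw  : Integer := 250;   -- say, a garbage reading from another stub
    Load : Percent;
  begin
    Load := Raw;   -- raises Constraint_Error: 250 is not in 0 .. 100
    Ada.Text_IO.Put_Line ("load =" & Integer'Image (Load));
  exception
    when Constraint_Error =>
      Ada.Text_IO.Put_Line ("bad value caught during testing");
  end Check_Demo;

For delivery, pragma Suppress (All_Checks) or a compiler switch turns
the checks back off, which is all I mean by "turned off in the final
system."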

If you do clean up the design before going on to coding (or during
coding), Ada is a clear winner here: with a running prototype as a
baseline, you can incrementally introduce typing and modularity.  Cut
apart the "subsystems" (sets of related packages), get their design
well defined (package specs and types), re-integrate them.  You may
want to do this one, or a few, subsystems at a time, re-integrating
before cutting out more subsystems.  Once you've partitioned your
design properly, you can pretty much just finish out the code.
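As a sketch -- the Motion package and its types here are hypothetical
-- a bare Speed : Float from the prototype might get pulled behind a
subsystem spec like this:

  package Motion is
    type Meters_Per_Second is digits 6 range 0.0 .. 300.0;
    type Degrees is mod 360;

    procedure Set_Speed (To : Meters_Per_Second);
    procedure Turn (By : Degrees);
    function Current_Speed return Meters_Per_Second;
  end Motion;

Re-integrate, and the compiler points out every place the rest of the
prototype is still passing around a bare Float.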

Finally, note that prototypes for requirements and early design
should seldom be hand-coded anyway.  You should use a GUI builder, or
something like MATLAB, for functional prototypes.  If you're manually
building a bunch of code to show to the user and throw away, you're
working too hard for too little return.

If you really are coding things up to verify your design approach, I
find that this is much easier to do meaningfully in Ada.  It may take
20% longer to build the prototype, but you know a heck of a lot more
about your design and its potential problems when you're done.

2. ADA'S ADVANTAGES FOR PROTOTYPING

Ada is an excellent language for rapid coding.  However, it does
require a different mind-set.

The typical C prototype approach is "code now, design later maybe."
(I'd apologize for the negative tone, except it's exactly what you
gave for the C project's explicit approach.)

The work product is an undesigned, kludged-together program that
works in a limited set of cases.  You have no real idea what will be
needed to extend it for all the cases the final system must handle.

Rapid prototyping in Ada uses the mind-set "design just a little,
code just enough."  It's easy to lay in a very rough, preliminary
design as a set of packages, types (mostly subtypes of predefined
types), subprograms and tasks.  You can then code up just the parts
you want to prototype, stubbing out the rest.
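Continuing the hypothetical Motion package from above, the body codes
only what this prototype exercises and stubs the rest:

  package body Motion is

    Speed : Meters_Per_Second := 0.0;

    procedure Set_Speed (To : Meters_Per_Second) is
    begin
      Speed := To;           -- the one piece we actually exercise now
    end Set_Speed;

    procedure Turn (By : Degrees) is
    begin
      raise Program_Error;   -- stub: not needed by this prototype
    end Turn;

    function Current_Speed return Meters_Per_Second is
    begin
      return Speed;
    end Current_Speed;

  end Motion;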

The Ada prototype takes a little longer, but not a lot longer.  After
all, you aren't laying in the entire system design before coding
anything -- avoiding that is the whole idea.  Assuming you're equally
versed in C and Ada, I'd expect coding the prototype to take maybe
10-20% longer in Ada -- and coding is just a small part of a
prototype develop/evaluate cycle.

And your work product really gives you a lot of information about the
design approach it embodies.  I find that significant design problems
-- the kind of thing that can make you start over -- surface a lot
earlier in Ada.  And, you aren't as tempted to "paper over" a problem
with a kludge that will grow as the system grows.

My experience in Ada is that you can build up some very useful
support packages for prototyping and preliminary design.  Even a
simple TBD package will give you a lot of help.  Consider:

  package TBD is
    procedure Statement (S : String);
    --  ... more placeholder operations as you find you need them
  end TBD;

This is used like:

  case User_Input is
    when Forward =>
      Movement.Forward;
    when Backward =>
      TBD.Statement ("move backward");
    when others =>
      TBD.Statement ("not handled by prototype");

Depending on what you are doing, the body of TBD.Statement can print
out the string, or just raise an exception.  Later, you can remove
TBD from your library, and the compiler will tell you if you've left
any TBD items in your code.
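One possible body for it, just as a sketch:

  with Ada.Text_IO;

  package body TBD is
    procedure Statement (S : String) is
    begin
      Ada.Text_IO.Put_Line ("TBD: " & S);
      --  or, to make a missing piece impossible to overlook:
      --  raise Program_Error;
    end Statement;
  end TBD;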

Finally, most Ada compilers these days provide tools that obviate the
use (and development and maintenance) of "make" scripts.  For
example, GNAT provides gnatmake, which will rapidly and certainly do
all the compiles you need to get an up-to-date executable, and only
the compiles you need.  The tool that does this for you when using C
is an intern, if you're real lucky.

3. THE PROBABILITY AND COST OF DIFFERENT KINDS OF ERRORS

You said:
> My use of the word
> "develop" was in reference only to design and code.

This is the real Achilles' heel of the argument.  "End of coding" is
a meaningless milestone for comparing costs.  Let's ignore maintenance.
To get to the first delivery, you must design, code, TEST and DELIVER.

(But before I entirely discard maintenance, I will mention that it is
MUCH easier and cheaper to handle that first wave of trouble reports
if you have a coherent, modular design, and if it's well expressed in
the code.)

> The argument is completely with the -risk of cost growth-.

First, let's pick apart your detailed worst-case:

> For example, to get to a detailed design review, let's say 65% of the work
> is done in the Ada case, but only 20% in the C case.

Here you're swapping methods again.  If you're doing waterfall-style
development, you don't even start coding until after preliminary
design -- if then -- so language choice is not relevant.

> In the simplest
> case, where a design flaw is found requiring starting completely over,

In this case, the management team and top-level technical staff
should be summarily fired.

I honestly can't imagine a case where something got to a detailed
design review, and then required scrapping the entire work product.
Even if there's a major conceptual problem, big chunks of the
requirements analysis, design and code should be reusable.  And in
stripping out an existing analysis and design and rebuilding it, I
would far, FAR rather start with something that was made in a modular
fashion, or that at least would let me partition it during the
rebuild.

Granted, I have no experience with a program that was this screwed
up, but I DO have experience upgrading systems well beyond their
original design, ripping out some things and adding new elements.
Ada makes this much, MUCH easier.

So the real numbers for your worst case might be that the Ada program
spent 25% instead of 20% of its budget, and will spend 80-90% of the
original budget instead of 90-100% to start over.

But of course, Ada shines more brightly if you don't foul up from the
get-go.

Now, let's look realistically at the -risk of cost growth-.

> However, for the C case, my risk
> of cost increase is very low, because there are many ways to use metrics to
> estimate the number of errors that will be put in during coding.

This is false, in the public domain.  (That is, not patently.  :-)

For the typical project, it may be fairly accurate.  But for the
typical project, we don't scrap everything through detailed design
and start over.  Let's compare either typical or worst-case scenarios.

1. For normal development, your C project's risk of cost increase is
higher, because you will tend to discover problems later.  That's the
whole point to strong typing and run-time checks: you don't find out,
late in testing, that you've missed something.  Sometimes you find
out that you left out a parameter, but sometimes you find out that
you missed a significant point that affects design, or even your
understanding of the requirements.  (Prototyping reduces this risk,
equally in either language, but it doesn't eliminate it.)

It's documented -- and just common sense -- that the later you find a
design or requirements error, the more it costs to fix.  You have
more work to redo, and if it affects other parts of the system, you
have to redo them too.

Not only that, errors found in late coding or testing have the worst
impact on final schedule.  If we scrap the design, we may be able to
work hard and smart and deliver close to schedule.  If we find a
major problem two weeks before delivery, how likely are we to be able
to rework the whole system in time?

Since Ada surfaces many conceptual errors earlier -- often during
design-time compilations, even in a strict waterfall development --
you are reducing your risk of cost growth by using it.

And, if you used Ada's facilities for modularity, you're likely to
get away with fewer changes to less code.

2. For a worst-case scenario, let's look at a problem found in late
testing that turns out to have ramifications throughout the program.
Changes are going to run through the code like cancer, and you have
almost no time to do it.

In the first place, as I said, this is FAR less likely with Ada.

In the second place, you're more likely to have modularity "firewalls"
isolating the problem, making its solution faster and easier.

In the third place, such changes are much easier to make when the
compiler is telling you what you've just made obsolete with each change.

Historically, this is much, MUCH more likely to happen, to the extent
that a delivery delay announced just before the expected delivery date
is hardly remarked on in parts of the commercial software world.
And, this kind of problem can cause a huge cost increase.

> With the
> Ada estimate, the risk of cost increase is higher, due to the extra cost
> associated with the design work associated with a well-designed Ada
> program.  This cost is related to higher-level design problems.

Again, here you're assuming that only the C project is prototyping
and communicating effectively with the customer.  If the other
project is charging forward with tunnel vision, it will have this
risk, whether or not it's using Ada.

The extra effort up front in a WELL RUN Ada program is insurance AGAINST
cost increases, later in the project.

I hope you find this of interest, and worth considering.

Best,
Sam Mize

--
Samuel Mize -- [log in to unmask] (home email) -- Team Ada
Fight Spam: see http://www.cauce.org/ \\\ Smert Spamonam
