TEAM-ADA Archives

Team Ada: Ada Programming Language Advocacy

TEAM-ADA@LISTSERV.ACM.ORG

Samuel Mize <[log in to unmask]>
Thu, 10 Jun 1999 18:22:44 -0500
Greetings,

I wish I had a good set of papers and published case histories to
refer you to -- perhaps some of the consultants and authors on the
list can provide some.  All I can do is tell you my own experience
and observations.

> I have compressed this a bit to make it less long.

Good plan.  Blaise Pascal (I think) once wrote "sorry this letter
is so long, I didn't have time to make it shorter" (paraphrase).

I'm not going to respond to everything quote-by-quote, but I will
use quotes to introduce my points.

- - - - -

First, to answer a specific question:

> >So the real numbers for your worst case might be that the Ada program
> >spent 25% instead of 20% of its budget, and will spend 80-90% of the
> >original budget instead of 90-100% to start over.
>
> My number was 65%.  Where did you get 20%?

Project C took 20% of its budget to get to rework city.  Project A
will take 25% instead of 20%.

- - - - -

> >> In the simplest
> >> case, where a design flaw is found requiring starting completely over,
> >In this case, the management team and top-level technical staff
> >should be summarily fired.
>
> I can think of 2 major Ada projects where something like this happened (not
> firing the management and top-level technical staff).  I imagine others
> can, also.

Really -- where the entire requirements analysis, preliminary design,
first-increment detailed design and code had to be scrapped entirely?
Nothing was saved, nothing reused?  My, my.  I've seen things that
required significant replanning and rework, but nothing that flushed
the entire work product to date.

- - - - -

> This is simply assuming -standard practice- ... in both cases

If you consider large, requirements-driven, front-loaded, traditional
projects as the baseline for Ada, I get to pick Microsoft as the
baseline for C.  Now let's talk about risk, schedule slips, late
deliveries, crashing software and products that the users hate.  :-)

- - - - -

> -standard practice- ... in both cases (no prototyping).

You specifically said that Project C used prototypes to elicit
customer response and reduce requirements risk, and Project A did
not.  My argument is that you can do so with Ada too, and that this
is the key difference between Projects A and C.

Ada's facilities support a continuum of methods.  For highly
reliable, long-lived systems, like missile avionics, it supports a
front-loaded life cycle.  But it can also be used fast and loose.
For an incremental development where user feedback will influence
design, you'd pick a middle road.

- - - - -

> to simply get a well-designed first iteration of a given
> set of functionality, it takes longer with Ada than with other languages.

You must compare like with like: well-designed first iterations
against well-designed ones, or not-designed against not-designed.
If you compare the well-designed to the not-designed, the
well-designed one will always take longer.

But really, nobody can make something work without design.  They
simply design as they code, and record their design decisions in the
code (or remember them).  This is both possible and easier in Ada.
You can use subtypes to indicate design intent, for instance, without
bringing in the overhead of strong typing.
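As a minimal sketch of recording design intent in subtypes (the
names Channel, Reading, and Subtype_Demo are invented here for
illustration, not from any project discussed above):

```ada
procedure Subtype_Demo is
   --  A subtype documents the intended range without creating a
   --  distinct type, so no conversions are needed anywhere.
   subtype Channel is Integer range 1 .. 16;
   subtype Reading is Float range -10.0 .. 10.0;

   Current : Channel := 1;
   Value   : Reading := 0.0;
begin
   --  An out-of-range assignment raises Constraint_Error at run
   --  time, catching mistakes even in rapidly-written code.
   Current := Current + 1;
   Value   := 5.0;
end Subtype_Demo;
```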

At equivalent levels of design before coding, the Ada project will
probably use 10%-30% more time -- that is, if the C project took 10%
of schedule, the Ada one will take perhaps 11%-13% of schedule.

And the modularity and clarity that cost you that 10%-30% will speed
up development of later increments, unless every increment is totally
unrelated to the previous one.

- - - - -

> The 65% and 20% were made up to make the end costs end up very close, but
> they are not way off from what I heard back in the early 80s when I started
> learning the language.  And I have never seen any data stating the opposite.

Anybody got references or case studies handy?  I should HOPE we've
learned something in the last 15 years!

Incremental development became popular in and after the late 80s[1].
Any numbers from the early 1980s compared waterfall-model projects to
undesigned, un-incremental, chaotic projects.

There have been papers in ACM's Ada Letters and elsewhere about
incremental development in Ada, but I don't have a library handy.

I can't see how Project A spent 65% of the schedule before they
showed the customer a first increment, unless they did a formal
requirements analysis, a formal preliminary design, and a formal
detailed design prior to coding their first prototype.  That isn't a
language issue.

It may be a cultural issue.  I've done iterative development in Ada,
and it works great.

Ada's packages and modular constructs let you define what small part
of the system you are doing "today," and chop out, in an orderly
fashion, what you will do "tomorrow."  You can reflect upcoming
design issues, for instance with stub packages and procedures, and by
using Ada as a PDL (there's an IEEE standard for that).  Or, you can
just leave out whatever you don't want to code today.  Tomorrow the
compiler will help you find every place you need to integrate it.
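A hypothetical sketch of that stub-and-defer style (the package and
procedure names are invented for illustration):

```ada
package Flight_Display is
   procedure Show_Altitude (Feet : Integer);     --  done "today"
   procedure Show_Heading  (Degrees : Natural);  --  deferred
end Flight_Display;

package body Flight_Display is
   procedure Show_Altitude (Feet : Integer) is
   begin
      null;  --  real rendering code goes here
   end Show_Altitude;

   --  "Tomorrow's" work: a stub that compiles and links, so callers
   --  can be written now.  When the real body is supplied, the
   --  compiler and linker will flag anything left inconsistent.
   procedure Show_Heading (Degrees : Natural) is
   begin
      raise Program_Error with "not yet implemented";
   end Show_Heading;
end Flight_Display;
```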

Would that approach shock Ada developers in 1985?  Perhaps.  It's
commonplace today.

For example, NASA's Space Station Training Facility was built with an
incremental approach.  In fact, it's STILL being built that way, as
it must change when the Station does.  No other approach is possible.
It uses a more front-loaded approach than the language requires --
for example, there was a full requirements analysis phase before
anybody did any design or coding in any language -- but I'm sure it
didn't cost 65% to get to detailed design for the first increment.

I've also coded Ada rapidly, without using its high-level design
features.  It comes close to the speed of hacking in C, yet you get a
lot of help in catching errors in your rapidly-written code.  Plus,
you get higher-level constructs like

  for I in Inputs'Range

which saves you from declaring I, keeping the upper bound of Inputs
in a separate variable, and remembering to increment the counter.

(OK, if you just want to look at time to finish writing code, this
doesn't help.  If you are looking at time to finish making the code
WORK, Ada comes out a lot more even.)
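A self-contained sketch of that loop in context (Inputs and its
contents are invented for illustration):

```ada
procedure Range_Demo is
   Inputs : constant array (1 .. 4) of Integer := (3, 1, 4, 1);
   Total  : Integer := 0;
begin
   --  I is declared implicitly, the bounds come from Inputs itself,
   --  and the loop construct handles the "increment" -- three of the
   --  classic C off-by-one hazards gone.
   for I in Inputs'Range loop
      Total := Total + Inputs (I);
   end loop;
end Range_Demo;
```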

> Compare Ada text books with C books.
> ...
> Don't say "The Ada people could have done it the same way as the C people."
> That is not the way anyone is teaching them to do it.

The Ada people could have done it the same way as the C people.

Sorry, couldn't resist.

And no, I wouldn't want them to.  But I would expect a disciplined,
decently-coded first increment in Ada to take 10%-30% more time, not
225% more (65% versus 20% of schedule).  It wouldn't be done to the
same coding standards as used for a high-reliability, long-lived
product like avionics, and it shouldn't be, if that isn't needed.

I will grant that one weakness in current education is teaching
people when to NOT use some methods.  It's only "engineering" if we
select the APPROPRIATE methods for the job, not if we over-engineer
the simple things (the full architectural drawings for the outhouse).

Best,
Sam Mize

[1] Barry Boehm introduced the Spiral model in ACM SIGSOFT Software
Engineering Notes in August 1986, and to a larger audience in IEEE
Computer in May 1988.  Frederick Brooks wrote that incremental
development was a promising approach for future work in "No Silver
Bullet," IEEE Computer, April 1987.

--
Samuel Mize -- [log in to unmask] (home email) -- Team Ada
Fight Spam: see http://www.cauce.org/ \\\ Smert Spamonam
