TEAM-ADA Archives

Team Ada: Ada Programming Language Advocacy


Sender: "Team Ada: Ada Advocacy Issues (83 & 95)" <[log in to unmask]>
From: "Mike F. Brenner" <[log in to unmask]>
Date: Tue, 18 Mar 1997 14:53:13 -0500
Reply-To: "Mike F. Brenner" <[log in to unmask]>
Parts/Attachments: text/plain (45 lines)
I am very interested in this question of good SW development practices, but
the figures you gave had no think time: engineering requirements analysis and
analyzing the impact of change. Are your figures the amount of time they
spent in various tools? If so, do you have access to the amount of time they
charged to the project when they were not in these tools? We could allocate
50 percent of that to overhead and 50 percent to engineering analysis, since
most spend little time analyzing the impact of their changes. That is why
most bugs are inserted by the software maintenance process and propagated
through the globally visible variables and interfaces.

In addition, there is no CM tool time given. That is curious because
usually the only way to measure activities is through a process model
tool, like a commercial CM tool.

Maybe the key is the words THEY SAY:
    > they say that they spend 20% of their time writing the
    > initial code. 80% of their time debugging their code

If the basis of comparison is what they say they do, and not actual
measurements, then what is the purpose of answering them?

The Ada projects I have personally worked on were in the 10K to 100K range,
although multiple baselines and multiple projects executed at one time
sometimes made them look like much larger projects. Ada, of course, has
tools to help break a project down into smaller chunks. In these projects,
the largest part of the work was analysis (about 50 percent), the second
largest part (about 40 percent) was certification testing for the user,
and a minor part of the project time was spent designing, coding,
unit testing, integrating, debugging, documenting, and managing the
project.
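As one illustration of the "smaller chunks" point above, here is a minimal
sketch of how Ada packages and child packages let a large project be split
into separately compiled, separately baselined units (the names here are
hypothetical, not from any project mentioned in this thread):

    -- Parent package: the public interface of one subsystem.
    package Flight_Data is
       type Altitude is range 0 .. 60_000;
       function Current_Altitude return Altitude;
    end Flight_Data;

    -- Child package: an extension of the subsystem that can be
    -- compiled, tested, and placed under CM on its own.
    package Flight_Data.Logging is
       procedure Record_Sample;
    end Flight_Data.Logging;

Because each unit compiles against only the specifications it depends on,
a 10K-to-100K-line project can be managed as many small baselines rather
than one monolith, which bears directly on the CM-tool question above.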

It is very hard to get details on coding versus unit testing versus
debugging because they take up such a tiny part of the life cycle of
software projects. However, it is quite well publicized that for
most DOD projects the independent testing cycle (not including what
the programmer does) takes up a big chunk of the time, near 40 percent.

What is interesting is the theory that if we spend more time
analyzing the impact of change, we would have less testing to do,
because fewer future changes will be needed to the software.

What is even more interesting is that the amount of time maintainers
spend in each life cycle process activity is not being measured, in