I don't really think we need another law. I think this can easily fit under
the existing Government Performance and Results Act. The issue might just be
finding which agency or group wants to collect it in a central location, and
letting the others know about it. I can think of several that might be
tempted by this like STSC at Hill AFB, or DCMC Software Center of
Excellence, or maybe SEI at CMU.
Marsha S. Roepe
DCMC Lockheed Martin Marietta
From: John Apa [mailto:[log in to unmask]]
Sent: Wednesday, December 30, 1998 8:59 AM
To: [log in to unmask]
Subject: Re: Language Efficiency
You've made an excellent point; Congress should act. The Paperwork
Reduction Act requires the government to audit the amount of time spent
filling out paperwork, so why not have similar requirements for software?
Does anyone have any contacts in Washington that we could use to
initiate such discussion/legislation? The big push to make government
efficient has not really happened as was promised; maybe this is a
suitable topic for our representatives to begin discussing. Maybe not as
entertaining as the current scandal but certainly of practical worth to
the country as a whole.
It should be an achievable task if we limit the discussion to just that
of improving the efficiency of software procurement and management.
There are many examples that draw attention to the problem. I think it'd
be a success if we were able to get a discussion started.
I'd encourage everyone to contact their representatives if it hasn't
already been done.
All great expeditions begin with a single step....
John T Apa [log in to unmask]
L-3 CSW (801) 594-3382
640 North 2200 West PO Box 16850 Salt Lake City, UT. 84116-0850
>From: Robert C. Leif, Ph.D. [SMTP:[log in to unmask]]
>Sent: Wednesday, December 30, 1998 8:36 AM
>To: [log in to unmask]
>Subject: Re: Language Efficiency
>To: Robert Eachus et al.
>From: Bob Leif, Ph.D.
>I like your data. However, it is still anecdotal. It is possible to do a
>cross-over study. However, as we agree, it cannot be blind.
>The bottom line is that the paucity of data clearly demonstrates that good
>manufacturing processes are NOT being followed in the software field. I
>suspect that by now most of us agree that the Ada mandate should have been
>replaced by the DoD being forced to keep decent data. If CMM can be applied
>to the contractors, why not force the Government agencies to employ ISO or
>some other reasonable standard for software acquisition? This standard
>should include obtaining and maintaining total cost and reliability data.
>Congress should require that DoD and other Government agencies analyze the
>results of their previous software practices and create a database for
>monitoring all future software projects throughout their lifecycles.
>> -----Original Message-----
>> From: Team Ada: Ada Advocacy Issues (83 & 95)
>> [mailto:[log in to unmask]]On Behalf Of Robert I. Eachus
>> Sent: Tuesday, December 29, 1998 11:11 AM
>> To: [log in to unmask]
>> Subject: Re: Language Efficiency
>> At 04:49 PM 12/26/98 -0800, Robert C. Leif, Ph.D. wrote:
>> >Fortunately, our universe is restricted to software development.
>> >Unfortunately, I do NOT believe that a true double blind
>> crossover study is
>> >even conceivable. This would require the same project being developed in
>> >both Ada and another language. However, I believe that it is impossible
>> >because there is no straightforward way to organize an experiment where
>> >neither the monitor (teacher) nor the student (user) know which
>> >language they are using.
>> Okay, I have to put in my own two cents as a statistician. ;-) No
>> traditional single- or double-blind study is possible, because in any
>> case the patient (programmer) knows what language he is writing in. But
>> it is fairly easy to do a study where neither placebo nor Hawthorne
>> effects occur.
>> The important thing is not to have the same programmer, or programming
>> team writing the software in two languages--each sample has to
>> approach the
>> problem as new. Basically this has occurred at several places; probably
>> the most conclusive study was at SUNY Albany.
>> My "study of studies" synthesis of what I have seen is that Ada (83) is
>> about five times more productive from design through integration and test
>> than C or Fortran. But putting on my Operations Research hat instead, the
>> interesting--and totally obvious in hindsight--conclusion is one that I
>> have stated here many times. The major difference is that in Ada, "coding
>> is done" means both exactly that and that the software is very nearly complete.
>> (Testing remains, and usually some GUI tweaking.) In other languages, it
>> is not unusual for 90% or more of the source lines to be changed after
>> "coding is done." On one (Fortran 77) project I remember vividly because
>> we had the numbers, there were 560 KSLOC of changes to a (final) 300+
>> KSLOC project that was just over 200 KSLOC when "coding was done." On the
>> other hand, I worked on one Ada program where there were 17 lines changed between
>> handover to integration and test and six months after fielding. (And
>> half of those changes were clarification of error messages or correcting
>> typos in same.)
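
The churn figures quoted above can be checked with a few lines of Ada; this is
only an illustrative sketch (the program and constant names are invented here),
using the KSLOC numbers from the message:

```ada
--  Illustrative sketch only: recomputes the churn ratio quoted above.
--  All figures come from the message; the names are invented for this example.
with Ada.Text_IO; use Ada.Text_IO;

procedure Churn_Ratio is
   --  Fortran 77 project: just over 200 KSLOC when "coding was done",
   --  with 560 KSLOC of changes before the (final) 300+ KSLOC product.
   At_Code_Done : constant Float := 200.0;
   Changed      : constant Float := 560.0;
   Ratio        : constant Float := Changed / At_Code_Done;
begin
   --  Roughly 2.8 lines changed for every line present at "coding done" --
   --  versus 17 changed lines total on the Ada project mentioned next.
   Put_Line ("Post-'coding done' churn ratio:" & Float'Image (Ratio));
end Churn_Ratio;
```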
>> I know I'm preaching to the choir, but it took a lot of getting used to
>> fifteen years ago. We were converting some tools from Multics to
>> the DPS6,
>> and needed to translate them from PL/I to Ada. Even though we built
>> several translation tools to help in the process, most of the "hand work"
>> time was spent dealing with cases, error or otherwise, that the PL/I
>> programmer hadn't considered. Most of those changes were backfit into
>> PL/I, and they became "just as good" as the Ada versions. Our conclusion
>> was that in other languages it can be up to ten times as expensive to
>> turn out product-quality code as a one-off kludge, but in Ada the
>> difference is under 50%.
>> Robert I. Eachus
>> with Standard_Disclaimer;
>> use Standard_Disclaimer;
>> function Message (Text: in Clever_Ideas) return Better_Ideas is...