TEAM-ADA Archives

Team Ada: Ada Programming Language Advocacy

TEAM-ADA@LISTSERV.ACM.ORG

Sender: "Team Ada: Ada Advocacy Issues (83 & 95)" <[log in to unmask]>
Date: Fri, 9 Oct 1998 10:18:34 -0500
Reply-To: Samuel Mize <[log in to unmask]>
From: Samuel Mize <[log in to unmask]>
Content-Transfer-Encoding: 7bit
In-Reply-To: <[log in to unmask]> from "W. Wesley Groleau x4923" at Oct 9, 98 09:36:05 am
Content-Type: text/plain; charset=US-ASCII
MIME-Version: 1.0
W. Wesley Groleau x4923 wrote:
> > Pro: Parallel constructs built in to the language.
> > Con: Tasking code is large.
>
> Is this really true?  And even if it is, how much smaller is code in
> another language containing calls to "outside" multi-threading libraries?

A more correct "con" statement would be:

  Con: Some compilers (especially older ones) include a huge run-time
       library in each executable, even for small and simple programs.

GNAT currently does NOT do this.  I believe Apex used to, but I
haven't used that product for 3+ years.
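
A quick, informal way to check any given compiler: compare executable
sizes with and without a task.  Something like this sketch (the names
are arbitrary) will drag in the tasking run time:

   with Ada.Text_IO;
   procedure Task_Size_Test is
      task Worker;                -- declaring any task at all pulls in
      task body Worker is         -- the tasking part of the run time
      begin
         Ada.Text_IO.Put_Line ("from the task");
      end Worker;
   begin
      Ada.Text_IO.Put_Line ("from the main program");
   end Task_Size_Test;

Build it, delete the task, build again, and compare the two
executables; the difference is what tasking costs on that compiler.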

> > Pro: Strict bounds rules allow compiler to build in automatic
> >      "debugging" software.
> > Con: Unoptimized code is large and slow.
>
> Is it really?  Empirical evidence?

I can only provide anecdotal evidence.  Turning off bounds checking
made a measurable difference with GNAT, even optimized.  As I recall,
optimization with GNAT commonly made a 20-50% speed difference,
sometimes over an order of magnitude difference.  Some of this is
analyzing out redundant run-time checking; some is general
optimization of the code.
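
For the record, "turning off bounds checking" can be done selectively
with pragma Suppress, or globally with GNAT's -gnatp switch.  A sketch
(the procedure itself is made up for illustration):

   type Vector is array (Positive range <>) of Float;

   procedure Store (V : in out Vector; I : Positive; X : Float) is
      pragma Suppress (Index_Check);   -- trust the caller: no bounds
   begin                               -- check is emitted for V (I)
      V (I) := X;
   end Store;

With GNAT, "gnatmake -O2 -gnatp" suppresses every check program-wide,
which is the easy way to time the difference for yourself.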

The joker in the deck is that (according to online discussions
I've seen by compiler developers) Ada's strong typing, and some of
its other features, actually allow for much stronger optimization IF
the user is taking advantage of them.  But they make it harder to
"hand-optimize" code.

So an old C hacker who expects his Ada code to be slow, and a
well-trained Ada developer who expects his Ada code to be fast,
will both be correct.  :-)

Even in C, with a good compiler, you're better off with machine
optimization.  I had to fix some C code once that did manual address
arithmetic to avoid the "overhead" of array notation.  In order to
figure out what the BLEEP the code was accomplishing, I had to turn it
back into the array notation.  I just couldn't decrypt it otherwise.
As a lark, I timed it both ways to see just how much execution time
they had saved with all that effort.  IIRC, the unoptimized times
were within observational limits of each other.  Compiled with full
optimization, the hand-optimized code was measurably SLOWER.  The
compiler was (of course) better at factoring out common sub-expressions.

But C compilers are many and cheap.  Some are bound to have such poor
optimizers that hand optimization is necessary.  Some are bound to
have such poor optimizers that using them generates buggy code.  So,
many C developers routinely hand-optimize their code.

Since C developers view minimal-quality compilers and hand optimization
as normal, they view good-quality Ada compilers as expensive and hard
to work with.

Best,
Sam Mize

--
Samuel Mize -- [log in to unmask] (home email) -- Team Ada
Fight Spam: see http://www.cauce.org/ \\\ Smert Spamonam
