> From: Corey Minyard
> [log in to unmask] wrote:
> >It's not a major issue for me, but my personal opinion is that
> the defaults
> >should pass validation, and people who need the performance boost of
> >turning off this kind of checking should have to do it intentionally.
> Well, that's probably true. In this particular case, I remember the
> GNAT documentation argues that the cost of this check is pretty high,
> and the number of times it's likely to cause a problem is pretty low, so
> the check is not high-value, but it is high cost.
Regarding the value of the check: it depends not just on the likelihood
of the failure, but also on the consequence of the failure. Some failures
are always worth checking for even if they are highly unlikely; for
example, if a failure might be a factor in causing a nuke plant to melt
down, the check earns its cost. Crudely, consequences rank:
inconsequential < frictional/quality < business-critical < life-critical.
Regarding the cost of the check: it's not the absolute cost that ultimately
matters, but the cost relative to the time-criticality of the
application -- crudely put, real-time vs. non-real-time.
To omit the overflow check by default is to optimize the defaults for a
real-time application whose failure consequences are less than
business-critical, e.g. a game (or a benchmark :-)
My view would be that the whole notion of second-guessing the user's time-
and consequence-sensitivity should give way to the "principle of least
surprise". In this case that would mean that, by default, the compiler
implements the semantics of the language.
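To make the semantics concrete, here is a minimal Ada sketch (the procedure name is my own invention). With checks enabled, as the language requires by default, the overflow raises Constraint_Error; with pragma Suppress (Overflow_Check) -- or with a compiler whose default leaves the check off, as old GNAT did unless -gnato was given -- the result is not defined by the language:

```ada
with Ada.Text_IO;    use Ada.Text_IO;
with Ada.Exceptions; use Ada.Exceptions;

procedure Overflow_Demo is
   --  pragma Suppress (Overflow_Check);  --  uncomment to turn the check off
   X : Integer := Integer'Last;
begin
   X := X + 1;  --  exceeds Integer'Last: overflow
   Put_Line ("unchecked: X =" & Integer'Image (X));
exception
   when E : Constraint_Error =>
      --  This branch is what the language semantics promise by default.
      Put_Line ("checked: " & Exception_Name (E));
end Overflow_Demo;
```

The point of the sketch is only that "compiler default" and "language default" can disagree, which is exactly the least-surprise problem.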