TEAM-ADA Archives

Team Ada: Ada Programming Language Advocacy

TEAM-ADA@LISTSERV.ACM.ORG

Subject:
From: Nick Roberts <[log in to unmask]>
Reply To: Nick Roberts <[log in to unmask]>
Date: Wed, 6 Feb 2002 18:29:06 -0000
Content-Type: text/plain
Parts/Attachments: text/plain (74 lines)
"Thomas A. Panfil" <[log in to unmask]> wrote on Wednesday, February 06, 2002
4:01 AM:


> Hi All,
>
> I'd like to be able to cite a good paper on why Buffer Overflow
> susceptibility is common in software written in some popular
> language(s), and rare or relatively easy to prevent when using
> other languages.  Advice, anyone?

Some hopefully relevant points.

For a buffer overflow vulnerability to actually be exploitable, all of the
following must hold:

(a) the underlying operating system or execution environment fails to
provide or deploy protection against the execution of code that lies in a
writable area of memory [1];

(b) the underlying operating system or networking software configuration
fails to isolate the execution environment of the (TCP) service application
to the maximum extent feasible [2];

(c) the service application has a twofold bug, whereby the client is able
to write binary data into a certain area of the service application's
memory, and then somehow cause the service application to begin executing
instructions somewhere within that area [3].

[1] C and C++ often prevent the use of such protection, even when it is
available (at no execution cost) on the architecture. While some forms of
protection can be used, others cannot, because of C's need for a 'flat'
address space. Ada does not require a flat address space (though it
typically suffers the limitation of having to interface to C software in
order to use operating-system-specific functions, of course).

[2] This especially pertains to running the service application as a normal
user (rather than root), and ensuring that that user has the minimal (file)
permissions necessary to do its job. On most UNIX-based operating systems,
unless the system administrator is very sophisticated, such elementary
precautions are typically not taken; on Windows NT (with IIS) they are
neglected even more often.
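As a concrete illustration of the precaution in [2], here is a minimal C
sketch of how a service that starts as root (say, to bind a port below
1024) might drop to an unprivileged user before handling any client data.
The function name drop_to_user is purely illustrative; the calls are the
standard POSIX ones. Note the ordering: the group must be dropped before
the user, because setgid() needs the privileges that setuid() gives up.

```c
#include <unistd.h>
#include <sys/types.h>

/* Hypothetical sketch: drop root privileges to the given uid/gid.
   Returns 0 on success, -1 on any failure. */
static int drop_to_user(uid_t uid, gid_t gid)
{
    if (setgid(gid) != 0)     /* drop group first, while still privileged */
        return -1;
    if (setuid(uid) != 0)     /* then drop user; irreversible for non-root */
        return -1;
    /* Verify the drop actually took effect before serving requests. */
    if (getuid() != uid || getgid() != gid)
        return -1;
    return 0;
}
```

Dropping to one's own current uid/gid is always permitted, so the sketch
can be exercised even without starting as root.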

[3] This is a bug that in theory should be extremely unlikely, yet which
demonstrably tends to crop up in (large) C and C++ software, and, to my
knowledge, never in software written in any other language.
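To make the bug in [3] concrete, here is a minimal C sketch of the usual
shape it takes: an unchecked copy of client-supplied data into a
fixed-size stack buffer. The names (handle_request, BUF_SIZE) are
illustrative, not from any real service. The vulnerable strcpy() is shown
only in a comment; the bounded version below rejects oversized input
instead, which is precisely the check an Ada compiler inserts for you.

```c
#include <stdio.h>
#include <string.h>

#define BUF_SIZE 16

/* Hypothetical request handler: copy a client request into a
   fixed-size buffer.  Returns 0 on success, -1 if the request is
   rejected as too large. */
static int handle_request(const char *client_data)
{
    char buf[BUF_SIZE];

    /* VULNERABLE variant (do not do this):
           strcpy(buf, client_data);
       A request longer than BUF_SIZE - 1 bytes overruns buf and can
       overwrite the saved return address on the stack -- handing the
       client both halves of the twofold bug at once. */

    /* Safe variant: refuse anything that would not fit. */
    if (strlen(client_data) >= BUF_SIZE)
        return -1;                /* reject oversized request */
    strcpy(buf, client_data);     /* now provably in bounds */
    printf("handled: %s\n", buf);
    return 0;
}
```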

The most deadly thing about a successful buffer overrun exploit (and
similar types of attack) is that the attacker gets to run his own code,
often with root privileges, and can thus truly "do anything he likes" from
that point on. Typically the host computer is totally compromised; this in
turn (if the attacker is skilled and persistent) can lead to whole networks
being compromised.

I believe there is a negligible likelihood of a TCP or UDP service
application written in Ada, especially with most or all checks left on,
suffering from a buffer overrun vulnerability, or any vulnerability that
permits the client to cause it to execute arbitrary code. This is regardless
of the compiler, the host machine (architecture), and the host operating
system. (Note however that this is not to be confused with the case of an
Ada main program using substantial library code written in C.)

A good starting point for more information may be the US DoE Computer
Incident Advisory Capability (CIAC) at:

http://www.ciac.org/ciac

I believe that with a bit of digging you will find much fodder for your
research. Happy hunting!

--
Nick Roberts
