>very regular and very small interval of 1/10000 of a second.  The issue
>that was raised was whether Ada could handle such a small time interval.
   It's not a question of what Ada can do, but of what a particular
implementation, running under a particular OS on a particular machine,
can do.  If you are running under MS Windows, which version, on what
hardware, and with which compiler?  And what exactly do you need: to
read the time accurate to 100 µs, to "delay" so that things happen
after 100 µs, to handle an interrupt every 100 µs, or to switch among
a large number of independent tasks, with the highest-priority one
getting control every 100 µs?  How close to exactly 100 µs must it be?
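   To illustrate the language-vs-implementation distinction, here is a
minimal Ada sketch (names of my own choosing) using the standard
Ada.Real_Time package.  The language guarantees only that "delay until"
does not wake *earlier* than the requested time; how much later it
actually wakes, and the value of the clock's Tick, are entirely up to
the implementation, OS, and hardware:

```ada
with Ada.Real_Time; use Ada.Real_Time;
with Ada.Text_IO;

procedure Tick_Demo is
   Period : constant Time_Span := Microseconds (100);
   Next   : Time := Clock;
begin
   --  Tick is the implementation's advertised clock resolution;
   --  actual delay accuracy may be much coarser than this.
   Ada.Text_IO.Put_Line
     ("Clock tick = " & Duration'Image (To_Duration (Tick)));

   --  A periodic loop with a nominal 100 µs period.  On a desktop
   --  OS such as MS Windows, the scheduler's timer granularity may
   --  make the real period far longer than 100 µs.
   for I in 1 .. 10 loop
      Next := Next + Period;
      delay until Next;  --  guaranteed not to resume before Next
   end loop;
end Tick_Demo;
```

   Comparing Clock before and after the loop on your own target is the
only way to find out what your particular compiler/OS/hardware
combination actually delivers.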