TEAM-ADA Archives

Team Ada: Ada Programming Language Advocacy


"Team Ada: Ada Advocacy Issues (83 & 95)" <[log in to unmask]>
Jesse Farmer <[log in to unmask]>
Sat, 4 Nov 2000 23:23:19 -0800
Tom Moran <[log in to unmask]>
Tom Moran <[log in to unmask]>
Decision Aids
"delay 0.003448275862"
  The PC clock chip ticks at 1.19318 MHz, or every 0.838 microseconds.
4114 of those ticks take 3447.9290 microseconds.
4115                take 3448.7671
and you want to delay    3448.275862 mics
It can't exactly be done.  It also takes some time to do the OS
call, read the clock chip, and to compare against the "end of
delay" time, further blurring precision.  This is not a function of
Ada, but simply of CPU and IO and OS speed.  If the timing is done
not by reading the clock chip (the "system performance counter" in
MS Windows parlance), but instead by waiting for OS timer
interrupts and possibly doing task switching, things can get *much*
worse.  Under DOS, for instance, the OS's clock ticked only once
every 65536*0.838 = 54919.168 microseconds, or almost 55 milliseconds.
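
  The bracketing arithmetic above can be sketched in a few lines of
Ada (a hypothetical little program; the names are mine, not from any
test code mentioned here):

```ada
--  Sketch: which whole numbers of 1.19318 MHz ticks bracket the
--  requested delay of 0.003448275862 seconds?
with Ada.Text_IO; use Ada.Text_IO;
procedure Tick_Demo is
   Tick_Hz : constant := 1_193_180.0;       --  PC timer chip frequency
   Tick    : constant := 1.0 / Tick_Hz;     --  about 0.838 microseconds
   Want    : constant := 0.003448275862;    --  requested delay, seconds
   Below   : constant Integer := Integer (Float'Floor (Want / Tick));
begin
   --  Prints 4114 ticks (~3447.93 mics) and 4115 ticks (~3448.77 mics):
   --  neither equals the requested 3448.275862 mics.
   Put_Line (Integer'Image (Below) & " ticks =" &
             Float'Image (Float (Below) * Float (Tick) * 1.0E6) & " mics");
   Put_Line (Integer'Image (Below + 1) & " ticks =" &
             Float'Image (Float (Below + 1) * Float (Tick) * 1.0E6) & " mics");
end Tick_Demo;
```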

  I recently had occasion to test some of the relevant timings with
4 different Ada compilers under Win95, Win98, W2000, and WinNT, on
various Pentiums around 200 - 300 MHz.  (I sent the clock timing
program to a few days ago, so you should soon be
able to download and try it on your own configuration.  It's pure
Ada and should work on any system.  I'd be curious to hear the
results on non-Windows systems.)  The time to execute "t :=
ada.calendar.clock;" ranged from 2.4 to 22.4 microseconds, so your
"1-2 microsecond" requirement might be pushing it, but should be
doable on a modern fast machine.  The time for "delay 0.0;" ranged
from 1.4 to 9.0 mics, except on one system, which apparently did
not special-case a zero-length delay and took 10000.0 microseconds.
The time for a "delay 0.001" ranged from 976 to 3827 to 10,000
microseconds.  The "976" was on a system with a duration'small of
1/4096 second = 244 mics.  The longer times clearly depended on the
OS-provided timing facilities.  My guess is that your 90 second
times for 17400 executions of a small, but non-zero, delay were
caused by a system with a minimum "delay" time of 5 ms
(17400 * 5 ms = 87 seconds).
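
  For reference, those measurements were of this general shape (a
minimal sketch assuming only Ada.Calendar; the procedure name and
iteration count are mine, not from the actual test program):

```ada
--  Sketch: average the cost of "delay 0.0" over many iterations,
--  since a single iteration is far below the clock's resolution.
with Ada.Text_IO, Ada.Calendar;
use Ada.Text_IO, Ada.Calendar;
procedure Time_Delays is
   N      : constant := 1_000;
   T0, T1 : Time;
begin
   T0 := Clock;
   for I in 1 .. N loop
      delay 0.0;
   end loop;
   T1 := Clock;
   --  Mean cost in microseconds; includes loop overhead.
   Put_Line ("mean ""delay 0.0"" cost:" &
             Float'Image (Float (T1 - T0) / Float (N) * 1.0E6) &
             " mics");
end Time_Delays;
```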

  So running separate tasks and using "delay" statements is
probably not going to work for high speed/high accuracy.  You'll
need to read the clock.
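
  A minimal sketch of what "read the clock" means here (hypothetical
names; note that a busy-wait like this occupies the whole CPU while
it spins, which is the price of the better resolution):

```ada
--  Sketch: busy-wait on Ada.Calendar.Clock instead of "delay",
--  so the achievable resolution is the clock's, not the OS's
--  minimum "delay" time.
with Ada.Calendar; use Ada.Calendar;
procedure Busy_Wait_Demo is
   procedure Spin_Until (Deadline : Time) is
   begin
      loop
         exit when Clock >= Deadline;
      end loop;
   end Spin_Until;
begin
   Spin_Until (Clock + 0.003448275862);
end Busy_Wait_Demo;
```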

  But my understanding of MIDI is that you talk to devices through
a MIDI port, which is essentially a (fairly slow) serial comm port.
I would think the important timing would come from the serial
port's clock, not from how long it takes the CPU and software to do
things (as long as they are "fast enough").  No?

  Some years ago I built a video editor.  Video timings are not as
fast as the ones you need, of course, but then today's chips are
rather faster than a 16MHz 386.  Sounds like you have an
interesting project.