I think most of us are missing something here.  We're talking about
something related to MUSIC.  How many microseconds of inaccuracy can a
human hear?  How long does a song have to be before the accumulated errors
add up to that many microseconds?
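To put some invented numbers on it: MIDI clock is 24 pulses per
quarter note, so at 120 BPM that's 48 ticks a second.  If every tick
arrived half a microsecond late, the drift would grow by only 24
microseconds per second, roughly 7 milliseconds over a five-minute
song.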

On top of that, all the instruments in the band should be using the same
MIDI clock, so they all have the same error, and they all finish the song
at the same time.  Who's going to notice if a song lasts 300 seconds
instead of 301?  Radio stations often get away with speeding up songs
enough to squeeze in one more 30-second commercial per half hour.

If you really do have a drift problem, try something like (type conversions
deliberately omitted):

   Microsec_Inaccuracy_Times_One_Million : Integer := ......;
   -- the difference between an actual tick interval and what the
   -- interval should be, in microseconds, multiplied by one million
   -- so the fractional part can be kept in an integer.

   Number_Of_Ticks := Number_Of_Ticks + 1;

   -- the correction is the tick count times the per-tick error,
   -- divided back down to whole microseconds
   Next_Tick := Current_Tick + Tick_Interval +
         (( Number_Of_Ticks *
            Microsec_Inaccuracy_Times_One_Million ) / 1_000_000);

   delay until Next_Tick;

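If you want a complete, compilable picture, here is one way the same
idea could look with Ada.Real_Time, computing each deadline from the
start of the song and the tick count rather than from the previous
tick, so the truncation in the division never compounds.  The tick
rate and the error figure below are made up for illustration:

   with Ada.Real_Time;  use Ada.Real_Time;

   procedure Tick_Loop is
      -- 64-bit count so Ticks * Error_Scaled can't overflow during
      -- a long song
      type Tick_Count is range 0 .. 2**62;

      -- invented numbers: a 1 ms nominal tick, and a measured error
      -- of +0.5 microsecond per tick, stored as microseconds times
      -- one million so it fits in an integer
      Tick_Interval : constant Time_Span  := Milliseconds (1);
      Error_Scaled  : constant Tick_Count := 500_000;

      Start_Time : constant Time := Clock;
      Ticks      : Tick_Count    := 0;
      Next_Tick  : Time;
   begin
      loop
         Ticks := Ticks + 1;

         -- deadline = start + N nominal intervals + accumulated
         -- error, divided back down to whole microseconds
         Next_Tick := Start_Time
           + Integer (Ticks) * Tick_Interval
           + Microseconds (Integer ((Ticks * Error_Scaled) / 1_000_000));

         delay until Next_Tick;
         -- ... send the MIDI clock byte here ...
      end loop;
   end Tick_Loop;
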
--
Write in  * Wes Groleau *  for President of the U.S.A.
I  pledge  to  VETO  nearly  everything.
http://freepages.rootsweb.com/~wgroleau