TEAM-ADA Archives

Team Ada: Ada Programming Language Advocacy

TEAM-ADA@LISTSERV.ACM.ORG

Subject:
From: "W. Wesley Groleau x4923" <[log in to unmask]>
Reply To: W. Wesley Groleau x4923
Date: Mon, 6 Nov 2000 14:58:12 -0500
Content-Type: text/plain

I think most of us are missing something here.  We're talking about
something related to MUSIC.  How many microseconds of inaccuracy can a
human hear?  How long does a song have to be before the accumulated errors
add up to that many microseconds?
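
For a sense of scale (figures invented for illustration): MIDI's
real-time clock runs at 24 ticks per quarter note, so at 120 beats per
minute that is 48 ticks per second.  Suppose each tick is off by a
hypothetical 2 microseconds:

   48 ticks/sec * 300 sec      = 14_400 ticks in a five-minute song
   14_400 ticks * 2 usec/tick  = 28_800 usec, i.e. about 0.03 sec

Three hundredths of a second spread over five minutes is far below
anything a listener could notice.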

On top of that, all the instruments in the band should be using the same
MIDI clock, so they all have the same error, and they all finish the song
at the same time.  Who's going to notice if a song lasts 300 seconds
instead of 301?  Radio stations often get away with speeding up songs
enough to squeeze in one more 30-second commercial per half hour.

If you really do have a drift problem, try something like (type conversions
deliberately omitted):

   Microsec_Inaccuracy_Times_One_Million : Integer := ......;
   -- the difference between an actual tick interval and what
   -- the interval should be, in microseconds, scaled by one
   -- million so a fraction of a microsecond fits in an Integer.

   Number_Of_Ticks := Number_Of_Ticks + 1;

   -- Compute each deadline from the start of the song rather than
   -- from the previous (already corrected) tick; otherwise the
   -- cumulative correction term gets applied more than once:
   Next_Tick := Song_Start + Number_Of_Ticks * Tick_Interval +
         (( Number_Of_Ticks *
            Microsec_Inaccuracy_Times_One_Million ) / 1_000_000);

   delay until Next_Tick;
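
For completeness, here is a minimal compilable sketch of the same idea
using Ada.Real_Time.  The tick rate, the drift figure, and the loop
bound are invented for illustration, and Steady_Ticks and Correction_US
are names of my own choosing, not anything from a MIDI library:

   with Ada.Real_Time; use Ada.Real_Time;

   procedure Steady_Ticks is
      Tick_Interval : constant Time_Span := Milliseconds (10);
      -- nominal interval between ticks (assumed value)

      Correction_US : constant Integer := 2;
      -- assumed: the ideal interval is 2 microseconds longer than
      -- Tick_Interval, so that much is added back per elapsed tick

      Song_Start      : constant Time := Clock;
      Number_Of_Ticks : Natural := 0;
      Next_Tick       : Time;
   begin
      while Number_Of_Ticks < 1_000 loop   -- arbitrary demo bound
         Number_Of_Ticks := Number_Of_Ticks + 1;

         -- Each deadline is computed from Song_Start, so neither
         -- loop overhead nor integer rounding can accumulate:
         Next_Tick := Song_Start
                      + Number_Of_Ticks * Tick_Interval
                      + Microseconds (Number_Of_Ticks * Correction_US);

         delay until Next_Tick;
         -- ... emit the MIDI clock message here ...
      end loop;
   end Steady_Ticks;

Because "delay until" takes an absolute deadline, a late wake-up on one
tick does not push every later tick back the way a relative "delay"
statement would.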

--
Write in  * Wes Groleau *  for President of the U.S.A.
I  pledge  to  VETO  nearly  everything.
http://freepages.rootsweb.com/~wgroleau
