TEAM-ADA Archives

Team Ada: Ada Programming Language Advocacy

TEAM-ADA@LISTSERV.ACM.ORG

From: Tom Moran <[log in to unmask]>
Date: Mon, 6 Nov 2000 15:01:42 -0800
> How many microseconds of inaccuracy can a human hear?
If something is supposed to happen at a 5 kHz rate (a 200 microsecond
period), how many microseconds of random jitter does it take to produce
significant noise energy in the range of human hearing?  If you are
trying to produce two sequences at 5 and 5.01 kHz, so the beat is down
at 10 Hz, you need to generate one of them with a 200 microsecond period
and the other at 199.6 microseconds.  If your clock resolution is 2
microseconds, then you'll get one at 200 and the other at 198, for
frequencies of 5 kHz and 5.05 kHz, giving a nice 50 Hz beat.
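The arithmetic above can be sketched in a few lines of Python (the function name is my own, and I'm assuming the period is truncated down to a whole number of clock ticks, which matches the 199.6 -> 198 example):

```python
import math

def quantized_freq_hz(target_hz: float, clock_res_us: float) -> float:
    """Frequency actually produced when the desired period is truncated
    down to a whole number of ticks of a clock_res_us-microsecond clock.
    (Hypothetical helper, not anything from a real MIDI implementation.)"""
    period_us = 1e6 / target_hz                   # desired period in microseconds
    ticks = math.floor(period_us / clock_res_us)  # whole clock ticks that fit
    return 1e6 / (ticks * clock_res_us)           # frequency of the quantized period

# Two tones meant to beat at 10 Hz, on a clock with 2-microsecond resolution:
f1 = quantized_freq_hz(5000.0, 2.0)  # 200 us fits exactly -> 5000 Hz
f2 = quantized_freq_hz(5010.0, 2.0)  # 199.6 us truncates to 198 us -> ~5050.5 Hz
beat = f2 - f1                       # ~50 Hz instead of the intended 10 Hz
```

So a 0.4% error in one period turns a 10 Hz beat into a 50 Hz one, which is the point of the example.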
  That applies to direct control of a waveform.  I too am surprised that
MIDI, which controls a device that generates the waveform itself, has
such tight timing requirements.
