> How many microseconds of inaccuracy can a human hear?
If something is supposed to happen at a 5 kHz rate (a 200 µs period), how many
microseconds of random jitter does it take to produce significant noise energy
in the range of human hearing? If you are trying to produce two
sequences at 5 and 5.01 kHz, so the beat is down at 10 Hz, you need to
generate one of them with a 200 µs period and the other at 199.6 µs. If your
clock resolution is 2 µs, then you'll get one at 200 µs and the other at
198 µs, for frequencies of 5 kHz and 5.05 kHz, giving a nice 50 Hz beat.
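As a rough sketch of that arithmetic (assuming the timer truncates each period down to a multiple of its resolution, which is how 199.6 µs becomes 198 µs; the function name is just for illustration):

```python
import math

def quantized_period_us(freq_hz, resolution_us):
    """Truncate the ideal period (µs) down to the clock's resolution."""
    return math.floor(1e6 / freq_hz / resolution_us) * resolution_us

p1 = quantized_period_us(5000.0, 2.0)   # 200.0 µs -> exactly 5 kHz
p2 = quantized_period_us(5010.0, 2.0)   # 198.0 µs -> ~5.0505 kHz
beat = abs(1e6 / p1 - 1e6 / p2)         # ~50.5 Hz instead of the intended 10 Hz
```

So a 2 µs grid turns an intended 10 Hz beat into one about five times faster, which is the kind of gross, audible error the question is getting at.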
That applies to direct control of a waveform. I too am surprised that
MIDI, which merely controls a device that generates a waveform, has such
tight timing requirements.