TEAM-ADA Archives

Team Ada: Ada Programming Language Advocacy

TEAM-ADA@LISTSERV.ACM.ORG

Subject:
From:         "W. Wesley Groleau x4923" <[log in to unmask]>
Reply-To:     W. Wesley Groleau x4923
Date:         Tue, 7 Nov 2000 09:11:36 -0500
Content-Type: text/plain
> > How many microseconds of inaccuracy can a human hear?
>
> If something is supposed to happen at a 5 kHz rate--once every 200
> mics--how many mics of random jitter does it take to produce
> significant noise energy in the range of human hearing?  If you are
> trying to produce two sequences at 5 and 5.01 kHz, so the beat is down
> at 10 Hz, you need to generate one of them at 200 mics and the other
> at 199.6 mics.  If your clock resolution is 2 mics, then you'll get
> one at 200 and the other at 198, for frequencies of 5 kHz and
> 5.05 kHz, giving a nice 50 Hz beat.
>   That applies to direct control of a waveform.  I too am surprised
> that MIDI, which is control of a device that generates a waveform, has
> such tight timing requirements.

This is irrelevant to the question.  The MIDI clock is used to time notes,
not waves within a note.
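
(For what it's worth, the arithmetic in the quoted paragraph does hold
up for direct waveform generation.  A minimal Ada sketch of the
quantization effect it describes--assuming, and this assumption is
mine, that the generator truncates each period down to its 2-mic clock
grid:

with Ada.Text_IO; use Ada.Text_IO;

procedure Beat_Check is
   --  Figures from the quoted paragraph; truncation to the clock grid
   --  is my assumption about how the generator rounds.
   Resolution : constant Float := 2.0;               --  clock tick, in mics
   Period_A   : constant Float := 1.0E6 / 5_000.0;   --  5.00 kHz -> 200.0 mics
   Period_B   : constant Float := 1.0E6 / 5_010.0;   --  5.01 kHz -> 199.6 mics
   QA : constant Float := Float'Floor (Period_A / Resolution) * Resolution;
   QB : constant Float := Float'Floor (Period_B / Resolution) * Resolution;
begin
   Put_Line ("Tone A:" & Float'Image (1.0E6 / QA) & " Hz");  --  5000.0
   Put_Line ("Tone B:" & Float'Image (1.0E6 / QB) & " Hz");  --  about 5050.5
   Put_Line ("Beat:  " & Float'Image (1.0E6 / QB - 1.0E6 / QA) & " Hz");
   --  About 50 Hz where 10 Hz was wanted.
end Beat_Check;

But again, that is waveform synthesis, not MIDI.)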

> > How long does a song have to be before the accumulated errors add
> > up to that many microseconds?
>
>   Part of stereo sound perception comes from the delay between the
> times a sound reaches your two ears.  At 1000 ft/sec, that's about one
> millisecond.  Suppose two waveforms are controlled by two separate
> computers, whose clocks drift apart by 1 minute/day (I've had worse).
> They will differ by one millisecond after 1.44 seconds of elapsed
> time, so one could imagine the sound of two instruments with one of
> them appearing to run from stage left to right and back again every 3
> seconds.  Noticeable? ;)

Again, if you're talking about phase and wavelength, this is
irrelevant.  In terms of the time the note begins in one ear versus the
time it begins in the other, it could be relevant.  However, if MIDI is
controlling two instruments, each instrument's perceived location is
going to be based on the difference between the times its sound arrives
at your two ears--not on the time the sound leaves that ONE instrument.
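
(The quoted drift figure itself is easy to check, for what it's worth.
A minimal sketch with the same illustrative numbers, names mine:

with Ada.Text_IO; use Ada.Text_IO;

procedure Drift_Check is
   --  Quoted scenario: two computers whose clocks drift apart by one
   --  minute per day, and an interaural delay of about one millisecond.
   Drift_Rate   : constant Float := 60.0 / 86_400.0;   --  seconds of skew per second
   Ear_Delay    : constant Float := 1.0E-3;            --  seconds
   Time_To_Skew : constant Float := Ear_Delay / Drift_Rate;
begin
   Put_Line ("Skew reaches 1 ms after" & Float'Image (Time_To_Skew)
             & " seconds");   --  1.44, as in the quoted post
end Drift_Check;

So the number is right; it just measures the wrong thing.)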

A MIDI stream is like a conductor's score, not like an oscilloscope screen.
The MIDI clock is like the conductor's baton.  And the original question
(in my opinion) is like complaining that some conductors do not wave their
arms with sufficient precision.
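
To put that analogy in code terms, here is a small sketch contrasting
the two views.  The record type and numbers are mine, for illustration
only--they are not anything out of the MIDI spec:

with Ada.Text_IO; use Ada.Text_IO;

procedure Score_Vs_Scope is
   --  "Score" view: roughly what a MIDI stream carries -- one event per note.
   type Note_Event is record
      Start_Ms : Natural;   --  when the note begins, in milliseconds
      Key      : Natural;   --  which key; 69 is the A above middle C
      Velocity : Natural;   --  how hard it is struck
   end record;

   A440 : constant Note_Event := (Start_Ms => 0, Key => 69, Velocity => 96);

   --  "Oscilloscope" view: what the instrument renders for that one event --
   --  a second of 440 Hz is 440 waveform cycles the MIDI stream never sees.
   Cycles : constant Float := 1.0 * 440.0;
begin
   Put_Line ("Score view: key" & Natural'Image (A440.Key)
             & " at" & Natural'Image (A440.Start_Ms)
             & " ms, velocity" & Natural'Image (A440.Velocity));
   Put_Line ("Scope view:" & Float'Image (Cycles)
             & " cycles inside that one note");
   --  The MIDI clock only has to place the event; the timing of every
   --  cycle inside the note is the instrument's problem.
end Score_Vs_Scope;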

--
Write in  * Wes Groleau *  for President of the U.S.A.
I  pledge  to  VETO  nearly  everything.
http://freepages.rootsweb.com/~wgroleau
