Since there is a lot of back and forth about just how little time resolution 16/44 has, I decided to run a little test to simulate how well a subsample delay would be encoded. I did the following:

1. Created a waveform with a single impulse of magnitude 32767 at its center. Assume this is at 44.1 kHz (the exact frequency is actually irrelevant for the rest of this).
2. Upsampled the waveform by a factor of 500 using an FIR filter. That puts it at about 22.05 MHz.
3. Delayed the waveform by one sample. That's roughly a 45 ns delay.
4. Downsampled the waveform by a factor of 500, taking it back to 44.1 kHz. (I just took every 500th point, since I already know the signal is band-limited and won't alias.)
5. Quantized back to integer boundaries after adding in 0.5 LSB of white noise (primitive dither). Given that the original impulse was of magnitude 32767, this is equivalent to quantizing to 16 bits.
6. Upsampled the quantized waveform by a factor of 500.
7. Compared the final upsampled waveform to the original and delayed upsampled waveforms.

The final waveform was correctly delayed by a single sample at 22.05 MHz. I therefore conclude that under ideal conditions, the minimum encodable delay of a waveform at 16 bits and 44.1 kHz is no greater than 45 nanoseconds.
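For anyone who wants to try this themselves, here is a minimal sketch of the procedure in Python with NumPy/SciPy. The buffer length, the polyphase FIR via scipy.signal.resample_poly, the RNG seed, and the cross-correlation check at the end are my own illustrative choices, not necessarily the exact setup used above.

```python
# Sketch of the subsample-delay experiment, assuming NumPy/SciPy.
import numpy as np
from scipy.signal import resample_poly

FS = 44100          # base sample rate (Hz)
UP = 500            # upsampling factor -> 22.05 MHz
N = 1024            # length of the 44.1 kHz test buffer (illustrative)

# 1. Single full-scale impulse at the center of the buffer.
x = np.zeros(N)
x[N // 2] = 32767.0

# 2. Upsample by 500 with a polyphase FIR filter.
x_up = resample_poly(x, UP, 1)

# 3. Delay by one sample at 22.05 MHz (~45.35 ns). The wraparound from
#    np.roll is negligible since the impulse sits at the center.
x_delayed = np.roll(x_up, 1)

# 4. Downsample by taking every 500th point (the signal is already
#    band-limited, so this won't alias).
x_down = x_delayed[::UP]

# 5. Add ~0.5 LSB of white noise as primitive dither, then quantize to
#    integer boundaries (equivalent to 16 bits, since the impulse is
#    full scale).
rng = np.random.default_rng(0)
quantized = np.round(x_down + rng.uniform(-0.5, 0.5, size=x_down.shape))

# 6. Upsample the quantized waveform again.
y_up = resample_poly(quantized, UP, 1)

# 7. Compare against the original high-rate waveform: find the lag (in
#    22.05 MHz samples) that best aligns the two.
def best_lag(a, b, max_lag=10):
    lags = range(-max_lag, max_lag + 1)
    scores = [np.dot(a, np.roll(b, -k)) for k in lags]
    return list(lags)[int(np.argmax(scores))]

print("recovered delay (22.05 MHz samples):", best_lag(x_up, y_up))
# Expected: 1, i.e. the ~45 ns delay survives 16-bit/44.1 kHz encoding.
```

The cross-correlation at the end is just one convenient way to read off the recovered delay; eyeballing the peak positions of x_up and y_up gives the same answer.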