Digital coax cable - BNC vs. RCA

Discussion in 'Audio Hardware' started by Chris Desjardin, Feb 1, 2016.

Thread Status:
Not open for further replies.
  1. Chris Desjardin

    Chris Desjardin Senior Member Thread Starter

    Location:
    Ware, MA
    I recently purchased a Little Dot DAC_1, and need to connect it to my sound card with a digital coax cable. The sound card has an RCA output for digital, and the DAC has both RCA and BNC inputs. I have heard that BNC is superior, but I wondered if I should get a cable with RCA on one end and BNC on the other (I found one that is available this way).

    Looking for advice - would I be better off keeping both ends RCA, or would the BNC on one end provide some benefit?

    All opinions welcome...

    Thank you.
     
  2. Captain Wiggette

    Captain Wiggette Forum Resident

    Location:
    Seattle
    For SPDIF, it really should be okay either way. I would not spend an exorbitant amount of time or money on an RCA-BNC cable if a good quality 75 ohm RCA-RCA cable is easily attainable.

    That being said, you did hear correctly that BNC is superior for this application, which requires 75 ohms. You cannot make a true 75 ohm RCA connector, so the use of RCA is a compromise for the common consumer (fundamentalists might say it's simply wrong, but that's going overboard IMO). The BNC connector is the correct choice, as 75 ohm BNC jacks and plugs are actually 75 ohms (FYI: there are BNC connectors at other impedances as well, so do check you're not mistakenly buying antenna-style BNC connectors, which are commonly 50 ohm). While having BNC on one end will help from a technical standpoint, for digital audio like this the bandwidth is pretty low, the theoretical signal benefit is negligible, and in any case it won't make a difference to the actual 'reception' of the signal at the receiving end (what you end up hearing). Basically, it won't have any impact as long as everything is working, which it should whether you have RCA on both ends or RCA-BNC. If you were using extremely long cable runs (hundreds of feet), and were plugging 10 shorter RCA cables together with barrel connectors, then you might start running into problems.

    TL;DR: RCA-RCA is fine. If you can get RCA-BNC for the same price, get RCA-BNC simply on principle.
     
  3. jea48

    jea48 Forum Resident

    Location:
    Midwest, USA
    RCA on both ends is fine, not to mention a bigger selection of digital coax cables to choose from.

    FWIW
    It may not make any difference in your case, but if you do go with RCA on both ends, listen to the cable installed in one direction, then flip it end to end and listen in that direction, to see if YOU can hear any difference. If you do hear a difference, mark the cable for future ID of the direction that sounds best.

    As always, YMMV.
     
  4. Black Elk

    Black Elk Music Lover

    Location:
    Bay Area, U.S.A.
    Are you sure the BNC is for S/PDIF and not unbalanced AES?

    https://en.wikipedia.org/wiki/S/PDIF

    There are two advantages to BNC: (1) it has a characteristic impedance (50 or 75 ohms depending on use), and (2) is a locking (bayonet) connector.

    It should make no difference whether you go RCA or BNC, and you don't really need a 75 ohm cable either if you are only spanning a few feet. S/PDIF is a 'slow' signal, about a 3 MHz bit rate at 48 kHz, so about 12 MHz for 192 kHz. The characteristic impedance, connector type/impedance and cable reflections become an issue when the frequency gets up into the VHF/UHF spectrum, which is much, much higher. If I were you, I'd stick with RCAs for the most compatibility down the road.
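
    As a back-of-envelope check on the numbers above, here is a minimal sketch (my own, assuming the standard 64 bits per S/PDIF frame and a typical ~0.7 cable velocity factor):

    Code:
    # S/PDIF carries 64 bits per frame, so the fundamental bit rate is
    # 64 x the sampling rate. A cable only starts to behave like a
    # transmission line once its length is a meaningful fraction of a
    # wavelength.
    C = 3.0e8   # free-space speed of light, m/s
    VF = 0.7    # assumed velocity factor of common 75 ohm coax

    for fs in (48_000, 192_000):
        bit_rate = 64 * fs                  # ~3.1 MHz and ~12.3 MHz
        wavelength = C * VF / bit_rate      # in-cable wavelength, m
        print(f"{fs // 1000} kHz: {bit_rate / 1e6:.1f} MHz, "
              f"wavelength ~{wavelength:.0f} m")

    A 1 m interconnect is a tiny fraction of either wavelength (~68 m and ~17 m), which is why matching matters far less here than at VHF/UHF.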
     
  5. Chris Desjardin

    Chris Desjardin Senior Member Thread Starter

    Location:
    Ware, MA
    Thanks, everyone! I am going with the RCA cables - the 2 devices I am connecting are only about a foot away from each other.
     
    BuddhaBob likes this.
  6. jea48

    jea48 Forum Resident

    Location:
    Midwest, USA
    Make sure the coax cable is at least 1.5 meters long.
    Or they can be short, less than a couple feet.
     
  7. Black Elk

    Black Elk Music Lover

    Location:
    Bay Area, U.S.A.
    ??

    Where do people get these notions?
     
  8. jea48

    jea48 Forum Resident

    Location:
    Midwest, USA
    Why do some people go through life with their head buried in the sand?
     
  9. jea48

    jea48 Forum Resident

    Location:
    Midwest, USA
    Correction:
    Or they can be short, like 6" or 8".
     
  10. Captain Wiggette

    Captain Wiggette Forum Resident

    Location:
    Seattle
    I was just assuming it was SPDIF. Either way, pretty much the same deal. :righton:
     
  11. Black Elk

    Black Elk Music Lover

    Location:
    Bay Area, U.S.A.
    Instead of burying my head in the sand, I buried it in books and journals on the subject, gaining a Ph.D. degree on the propagation of wide bandwidth UHF radio waves. I have considerable experience using coaxial cables at radio and audio frequencies.

    So, since you ducked the question, what is the reasoning behind your suggestion on S/PDIF cable length?

    I couldn't find a manual online, so could not determine whether the input was S/PDIF or unbalanced AES. I'm not a fan of using non-standard connectors for this reason. Although S/PDIF and AES are almost the same at the data level, they differ in voltage levels, so there is potential for problems.
     
  12. Captain Wiggette

    Captain Wiggette Forum Resident

    Location:
    Seattle
    Tread carefully, engineers and knowledge are not looked upon kindly here.
     
  13. jea48

    jea48 Forum Resident

    Location:
    Midwest, USA

    From a previous post.
    Black Elk Said:
    ??

    Where do people get these notions?


    Sorry, that was a question? I thought you were just being sarcastic.


    Quote from the link below:

    Many of you may have heard or read that it is beneficial to use at least a 1.5m length digital cable from your Transport to your DAC. There are actually technical reasons for this, but the requirement also depends on the behavior of the signal from the S/PDIF digital output on your Transport. It turns-out that the jitter on the digital signal can increase if the cable is too short, and the increased jitter can cause the audio to have "halos" or be out of focus.
    S/PDIF (or Sony/Philips Digital Interface) is a digital signaling standard specified at 75 ohms characteristic impedance and terminated on both ends. This means that the source driver (in the Transport) must have an output impedance of 75 ohms and the receiver (in the DAC) must have a parallel resistive termination of 75 ohms. If these terminations are both set to 75 ohms, then ideally the signal will propagate from the Transport to the DAC and no reflections will occur on the transmission-line that connects the Transport to the DAC, assuming that all components of the transmission-line are also 75 ohms. The transmission-line components (excluding the driver, receiver and terminations) include:

    See the rest of text in the Link.
    http://www.positive-feedback.com/Issue14/spdif.htm
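
    To make the article's timing argument concrete, here is a minimal sketch (my own illustration; the 5 ns rise time is an assumed example, not a figure from the article). The idea is that a reflection is most troublesome if it arrives back at the receiver while the edge is still rising:

    Code:
    C = 3.0e8
    VF = 0.7            # assumed velocity factor of typical 75 ohm coax
    RISE_TIME = 5e-9    # assumed transmitter rise time, seconds

    for length_m in (0.5, 1.0, 1.5, 2.0):
        one_way = length_m / (C * VF)
        # The re-reflection reaches the receiver two extra one-way trips
        # after the original edge (receiver -> source -> receiver).
        round_trip = 2 * one_way
        verdict = "lands on the edge" if round_trip < RISE_TIME else "clears the edge"
        print(f"{length_m} m: returns {round_trip * 1e9:.1f} ns later -> {verdict}")

    Under these assumed numbers, a 0.5 m cable puts the first re-reflection inside the edge (~4.8 ns), while 1.5 m delays it to ~14 ns, well clear of it - which is the essence of the 1.5 m recommendation.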

    I would be happy to supply you more info on the subject from a couple of well-known, well-respected EEs who also have a background on the subject and frequent the Audiogon Forum.

    Best regards,
    Jim
     
    Last edited: Feb 4, 2016
  14. John Moschella

    John Moschella Senior Member

    Location:
    Christiansburg, VA

    For this application you are right, it really should not make a difference as long as the connectors are good quality.

    Canare tried to make a true 75 Ohm RCA and did a pretty good job of it, as the freq response is quite high for an RCA. I use these connectors for my video cables.

    FYI, I know that there are "75 Ohm" BNCs, but this connector was designed as a 50 Ohm connector back in the '50s and is the standard connection for lots of measuring equipment like oscilloscopes. The 75 Ohm BNC is really a kludge job: they have to hollow out the insulator in the connector to try to get to 75 Ohms. The result is pretty good, but the freq response is better for the 50 Ohm connector mated to 50 Ohm cable.

    Now what is really interesting is that consumer audio (and video) equipment exists with BNC connections using 50 Ohm panel jacks. I saw this on a Mark Levinson DAC many years ago and it gave me a laugh. You can tell very easily by looking at them: the quasi-75 Ohm version has less plastic. Here is a good example:

    [image: 50 Ohm vs. quasi-75 Ohm BNC jacks]
     
  15. Black Elk

    Black Elk Music Lover

    Location:
    Bay Area, U.S.A.
    The ?? is to question the reasoning for your suggestion; the second question is to determine where such a suggestion originated.


    That PF article is nothing but scary-sounding nonsense. What is the magnitude of the reflected signal relative to the signal at the S/PDIF transmitter? How many reflections accumulate that can be considered to cause self-interference? Where is the mathematics describing what is happening? Where is the modeling of the behavior to show the expected result as a function of cable length? Where are the physical measurements? Where are the references to external, peer-reviewed journals?

    They claim that the jitter level will be increased, so where is the proof? They should be able to measure the change in jitter level at the S/PDIF receiver input as a function of cable length. So, where are the measurements?

    Are they aware that the S/PDIF interface has a defined jitter margin? If I am remembering correctly, it is +/- 20 nanoseconds for 1 Fs (44.1 kHz/48 kHz) transmission, which is what the vast majority send over S/PDIF interfaces.

    Let's assume that they are right, what is the impact on the output of the DAC? The 'increased jitter' is certainly not sufficiently high to induce bit errors, as we would hear that straight away, especially in compressed audio when using S/PDIF for DTS, Dolby-Digital, etc. (you accept that there are many audiophiles using 1 m S/PDIF connections on a daily basis?). It would also be easy to show (as has been done on previous occasions on this very Forum) that the data received via S/PDIF matches the ripped values from CD (for example).

    So, if there is no change in data values, is there any change in the jitter spectrum, noise levels, distortion, etc. at the output of the DAC? Again, they provide no measurements to support their claim. Even the likes of Stereophile, TAS, etc. have had to accept (following their own measurements) the claims of modern DAC designers that their units are excellent at rejecting transmission jitter, and impervious to cable changes.
     
    SandAndGlass likes this.
  16. jea48

    jea48 Forum Resident

    Location:
    Midwest, USA
    Black Elks,

    Thanks for taking the time to lay out your reasoning for why you think Steve Nugent's comments are, in your words, "nothing but scary-sounding nonsense".
    It would be nice if Mr. Nugent were here to defend his statements for the record.

    For your consideration, please read this old Stereophile article from 1993
    titled A Transport of Delight: CD Transport Jitter. Old? Yes, but for its time it shows the differences in transports and the effects they have on the transmission of the digital signal from the transport, through the digital coax cable with RCA ends, to the DAC. No, you will not read anything about the length of the digital coax cable. It would have been nice if they had given the length; maybe in 1993 that was not a consideration, I don't know. You will find why they were surprised to discover the cable was directional, and their explanation of why.
    http://www.stereophile.com/features/368/index.html#i0yTBMrZJsTZYoGm.97

    My takeaway from the article? Not all transports are created equal, and Stereophile provided lots of measurements and graphs to show why. Do I understand all the technical talk? No, but I do know from personal listening experience that CD transports do not all sound alike. Nor do all digital coax cables with RCA ends sound alike. And from my experience, digital coax cables with RCA ends are directional.

    Next, for your consideration: I ran across this rambling, long thread that talks about the same issues you raised with Steve Nugent's comments in the PF article.
    Yes, I read the whole thing all the way through.

    Finally, on the last page, this post by Jocko Homo.

    Quote:

    I am going to abuse my powers, or whatever titular power I retain, and temporarily unlock this thread. Long enough to add some clarification, as one of our better members has linked it. We are undoubtedly going to pick up lots of folks, and since we are not going to have this thread open, I am going to try and anticipate questions they will have. Answer them in advance, and lock the thread.

    Long vs short?

    There is no simple answer. So, let me give you examples of where one will be better than the other.

    Situation #1.) Good return loss on each end. By good, I mean <-30 dB. (That is around 3% reflection, from both ends.) In this case, you can use pretty much any cable. A longer one will have more pulse dispersion, and may not be the best choice. Even if you have a slow rise time, you can pretty much get by with any cable, as you are only going to have a few percent of the signal bouncing back from the RX end, and a few percent of that value getting back to the RX end.

    (You really only have to worry about the first reflection. Forget the 82nd or 101st reflection; they are below the noise level, even with a short or open at the far end.)

    Situation #2.) Poor return loss on each end. Something from -8 dB to something close to -20 dB. (40% reflection, down to 10% reflection.) If the cable is very short, you need to worry about reflections. The first reflection will have enough energy to upset the rise time. Because of the amount of energy reflected back, it will land in a part of the waveform that will change its rise time characteristics.

    Now, here is where you need to know more info than most of you have at your disposal. Not only do you not know the reflection coefficient, you probably don't know the rise time, of the TX end.

    The slower the rise time, the more you have to worry about reflections. The longer it takes the waveform to get from 20% to 80% (that is how rise time is defined), the more likely a reflection can upset things.

    So, if you have a slow rise time (as is not uncommon, as a lot of large manufacturers cripple the output to avoid EMI problems), then you need to have a longer cable than you would with a hot-rod DIY unit, where you do not care about EMI. On top of that, those crippled outputs almost always have really poor impedance control. (No, the mega-big multi-national conglomerates really don't care about "high-performance". Only meeting international standards.)

    If this is what you have, then in this case, long cables are the only way to go. Short cables are a nightmare. Stop and think about this: if your rise time is 12 nSec, your cable has a 70% velocity of propagation, how far of a cable can you have if the 20% point of the waveform gets to the far end, when the 80% portion is just fixin' to leave. (That is your homework assignment.) Do the same for 3 nSec. Now do you see why rise time is important, as well?

    To recap: #2, poor RL on both ends, slow rise time, long cable essential. If you can speed up the rise time, the cable length can be reduced.

    What if you have good RL on one end, but crappy on the other?

    Situation #3.) This one doesn't have as neat an answer. If you have that -30 dB RL on one end, we are only talking 3% or so bouncing back to the source. Let's say it is a crappy source, and 25% bounces back. OK........0.03 * 0.25 = 0.0075, or -43 dB. Probably not enough to worry about, but it could be. (Remember, in #1, I postulate that you need -30 dB on both ends. This means the total amount of the first reflection, to arrive at the RX end, is (-30 dB) + (-30 dB) = -60 dB, or 0.1 percent.) So, -43 dB.........pretty good, but could be a tad better, for the cable to be irrelevant.

    So, which is better, in this case? Well, you really need to know how fast the rise time is. We are in an area, where the trade-off between pulse dispersion (slowing down of the pulse), and the effect of the reflection effectively slowing the rise time, are tough compromises.

    If it were my system, and the rise time was slow, the pulse dispersion may not make enough of a difference to make a difference. So, I would opt for the long one. If I knew the rise time was really fast, the laws of physics say the length of the cable can be shorter, with all other factors being the same. So, it is possible to get by with 1m or so. With 2m, if the cable has good BW, it won't slow down the pulse enough, to be noticeable. So, I would go with 2m, as it would give you some safety margin.

    OK, one last point.

    There is lots of talk about some mythical cable, commonly known as Jocko's cable. It is not the cable some have linked to. That cable(s) may have derived from it, but are not the same thing.

    Jocko's cable is a very long, low BW cable, designed to work in condition #2. Not #1, not #3.

    Why is that?

    Simple.

    At the time it was designed, it was not uncommon to see really horrible RL, coupled with really slow rise times. In this condition, a long cable was called for. A cable with low BW was chosen, as it exhibited almost no difference in rise time, regardless of impedance control, on either end. You could say it was a band-aid, to allow anyone to stick a cable in any system, with any combination of equipment, and it would perform the same in all of them.

    18 years later, I do not think things have improved much. So, a long cable is one option to control reflection problems.

    An attenuator is a simpler solution. See my comments here http://www.diyaudio.com/forums/digital-source/168901-rf-attenuators-jitter-reducers-4.html#post2222770 for clarification on how attenuators work.

    You guys are free to try any and all solutions. The one that sounds best is the best one to use. But, there are many ways to skin the cat.

    Helps to know if it is an alley cat, or a lion, before you try to skin it.

    OK, thread closed again. Hope this subject is put to rest.

    http://www.diyhifi.org/forums/viewtopic.php?f=2&t=1943&start=75
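
    For anyone who wants to work Jocko's homework and check his reflection arithmetic, here is a minimal sketch (my own; it uses only the numbers quoted above):

    Code:
    import math

    C = 3.0e8
    VF = 0.7   # the quoted 70% velocity of propagation

    # Homework: the cable length at which the 20% point of the edge
    # reaches the far end just as the 80% point leaves, i.e. one rise
    # time's worth of travel.
    for rise_time in (12e-9, 3e-9):
        print(f"rise time {rise_time * 1e9:g} ns -> ~{rise_time * VF * C:.2f} m")
    # 12 ns -> ~2.52 m; 3 ns -> ~0.63 m.

    # Return loss (dB) -> voltage reflection fraction, as in the quote.
    def reflection(rl_db):
        return 10 ** (-rl_db / 20)

    print(f"-30 dB RL -> {reflection(30):.1%} reflected")   # his '3%'
    print(f"-8 dB RL  -> {reflection(8):.1%} reflected")    # his '40%'
    # Situation #3: 3% off one end times 25% off the other:
    print(f"{20 * math.log10(0.03 * 0.25):.1f} dB")         # ~ -42.5 dB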

    Black Elk,
    Your thoughts?

    So can you really say it doesn't matter if the cable is 3/4 m, 1m, 1.5m, or 2m long?
     
    Last edited: Feb 5, 2016
  17. jea48

    jea48 Forum Resident

    Location:
    Midwest, USA
    Black Elk,

    My apologies for the misspelling of your username in the previous post. It was not intentional.

    For your consideration.
    Here is an Agon Thread where the OP is asking for help because he is experiencing dropouts in the sound from his audio system.

    Just curious what you think was causing the digital dropouts and the fix that solved the problem.
    Jim
     
  18. Nate

    Nate Forum Resident

    To this non-EE, the 1.5 m approach sounds reasonable. Certainly no harm in going to 1.5 m.
     
  19. Oggy

    Oggy Forum Resident

    Location:
    Cambridge, England
    I made my own a few years ago, 75 Ohm connectors on both ends with a 75 Ohm cable. It was very cheap and worked well, but was a lot shorter than 1.5 m.
     
  20. Black Elk

    Black Elk Music Lover

    Location:
    Bay Area, U.S.A.
    OK, took a little while to find the time to read everything, but here goes.

    1. Stereophile article

    My better judgement told me that I should have stopped reading when I saw the words 'Robert Harley', but I did read all 10 pages of the article. In short, what did he show:

    a. that the jitter level at the output of ONE specific DAC varies with the jitter level of the attached transport;
    b. different transports can have different jitter levels;
    c. some anti-jitter devices are not a 'cure' for all transports.

    What did he claim?

    d. the change in jitter levels of the transports are 'clearly' audible;
    e. lower jitter transports, in general, are better;
    f. we need to consider video-type transmission solutions to minimize cable problems.


    Firstly, it should come as no surprise that different S/PDIF interfaces would have different transmission jitter levels, given the variety of crystal oscillators that can be used in players/transports. We also know that different solutions in DACs are able to reject more/less of the transmission jitter as a result of the decoupling between incoming data and DAC master clock. We also know the mathematical equation which links word-length to maximum sampling clock jitter to ensure true X-bit performance at a given frequency (usually only 20 kHz is considered). That equation shows that the sampling clock jitter must be 121 picoseconds, or less, for a 20 kHz signal at 16 bits for the sampling clock jitter to not introduce distortion. That jitter level is significantly smaller for longer word-lengths, or if you consider a higher maximum frequency to be reproduced distortion-free. The transmission jitter is entirely removed in the S/PDIF receiver, but that jitter can modulate the DAC master clock if the design allows it.
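
    For reference, the equation in question is the standard criterion that keeps jitter error below the quantization floor of an N-bit full-scale sine; a quick sketch (the function name is mine):

    Code:
    import math

    def max_clock_jitter(bits, f_max_hz):
        # Peak sampling-clock jitter (seconds) that keeps the error of a
        # full-scale sine at f_max_hz below the N-bit quantization step:
        # t_j = 1 / (2 * pi * f_max * 2**bits)
        return 1.0 / (2 * math.pi * f_max_hz * 2 ** bits)

    print(max_clock_jitter(16, 20_000))   # ~1.21e-10 s, i.e. ~121 ps
    print(max_clock_jitter(24, 20_000))   # ~4.7e-13 s, i.e. ~0.5 ps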

    Harley likes to make all sorts of claims about audibility of jitter, but he makes no reference to the studies which have been done on the audibility of sampling jitter in a digital system. Indeed, he makes no reference to an Audio Precision technical note written by the highly respected Julian Dunn from which Harley seems to have copied much of the technical background:

    http://www.audiophilleo.com/ja/docs/Dunn-AP-tn23.pdf

    The above paper makes reference to one study on the audibility of sampling jitter, and it is still referred to today in further studies on the subject, e.g.:

    https://cdnav.netfirms.com/shared/Jitter DBT.pdf

    What do all the academic (AES, IEEE, etc.) publications tell us? The audibility of sampling jitter depends on the jitter spectrum (tonal or noise-like), the level of jitter and whether the listener is presented with pure tones or music. I do not know of any academic study which has shown that sampling jitter levels in the picosecond range are required. Indeed, they all seem to agree that the sampling jitter is only audible in the nanosecond range (some suggest some tens of ns, others in the 100s), and is easier to hear on pure tones than music (i.e., the threshold level is a bit lower for tones compared to music).

    Academic studies have also shown that critical audio memory is very short, on the order of a few seconds (this seems to be something that a good number of Forum members have a hard time accepting). It is for that reason that all formal listening tests require both short audio samples and rapid switching between the things being compared. In my experience, working at the research lab of one of the consumer electronics giants, audio comparisons become extremely difficult as the difference diminishes. I had access to a floating listening room which met or exceeded all ITU requirements. When doing tests on sampling rates or identical pressings under double-blind conditions, you quickly learn how much nonsense is spouted in the audio press. You will notice that Harley, even knowing the measured differences, did not follow up with a strictly controlled DBT to see whether he could reliably identify each transport through a given DAC.

    All kinds of people make all kinds of claims, and we know from taste and comparative testing that humans are easily fooled (scientists do believe, however, that a person's taste, for example, can be swayed by the knowledge that they are tasting something 'expensive' compared to the 'cheap' alternative, even though the two things are identical). Despite people here, and elsewhere, making claims about the sonic differences between identical pressings, Prism, a number of years ago, could find no one who could reliably identify differences, which matches my own study during the SACD project (Prism published their findings, we did not when we realized that nothing concrete was coming out of the study).

    Getting back to the topic at hand, even if transmission jitter modulating the DAC master clock had been an issue for all DACs back when Harley wrote his article, it seems that even the worst performing transport would keep the jitter level below audibility! Turning to today, the jitter rejection of modern DACs is incredible. Benchmark is a popular choice on the Forum, and are well-known for their jitter-rejecting claims. So, let's look at Stereophile's measurements:

    http://www.stereophile.com/content/benchmark-dac2-hgc-da-processorheadphone-amplifier-measurements

    (see Figs. 9 to 11)

    Robert E. Greene (writing for TAS) about another Benchmark device wrote:

    The essential operating mechanism of both Benchmarks as DACs is as follows (stand by for a moderate amount of techno-babble). The jitter independence is accomplished by doing a sample-rate conversion of the output, which is clocked on a crystal-oscillator internal to the Benchmark and not correlated in timing to the timing of the input signal. The input data is processed algorithmically to the output sample rate. Thus, in effect, the sample-rate conversion calculations are necessarily treating the input as just bits. And while the clock of the converted sample could have been somehow synchronized to the input signal’s clock rate, as happens in some “upsamplers,” this is, in fact, not done. The converted sample-rate clock’s timing is set independently. So the final digital signal being converted to analog is, in fact, determined from the input bits but is not otherwise related to the input at all. Here, bits really are just bits, and jitter on the input is simply ignored. This can be checked explicitly, by introducing jitter on the input and observing the effects if any on the output. Even at very high levels of such introduced input jitter, the output jitter artifacts are far below the level of audibility—on the order of 140dB down from full level.


    This is a really vital matter for people like me, who like to use DSP processors, since the bit output of such devices is always as it should be, but jitter problems may have developed from, if nothing else, transmission of digital through multiple cable connections. Input jitter immunity is also vital for computer playback, since USB transmission is, it seems, rather jitter-prone. The Benchmark’s conversion method simply makes these non-issues.


    The electronics industry has always been full of claims of technical perfection, and one has, in view of history, to ask oneself: Is this input jitter-independence real in audible terms? Benchmark in a polite sort of way seems to challenge anyone to demonstrate that the transport matters audibly, as long as it is bit-perfect. And I must say that this is not a challenge I would bet that I could meet, after some trials. And anyone who thinks he can ought, I think, to try it blind. But before you go to the trouble you might also want to have a look at Benchmark’s test wherein it introduces jitter artificially into the input signal and check what it does to the output. “Audibly nothing” is a good way to describe what change occurs in the output signal—unless you really think that things 140dB down from signal are significantly audible.


    http://www.theabsolutesound.com/art...1-pre-digital-to-analog-converter-and-preamp/

    For a reviewer to admit this is quite enlightening, and I have noticed John Atkinson say similar things when trying different cables with the Benchmark. What do the measurements show? That it is possible to design digital processors which can eliminate jitter to well below audible levels.

    Work has also not stopped on low jitter S/PDIF solutions, see:

    http://80.75.67.56/documents/uploads/misc/en/A_high_performance_SPDIF_receiver_Oct_2006.pdf

    (note how he refers to the same audibility paper as Dunn, but does not seem to link the jitter performance to audibility)

    Finally, on Harley, he too seems to think that RF solutions are needed for the S/PDIF signal. For me, there is a world of difference between the transmission of a few MHz baseband signal, and that same baseband signal modulating a UHF carrier, which is what you have in a video system. As the carrier frequency increases, cable quality, connector quality, etc. become increasingly important, until you hit a point where the carrier frequency is so high that you forget about cable and switch to waveguides. Are such considerations needed for S/PDIF? I contend the answer is, "No!" Why? Because many of us are using all kinds of cable, at all kinds of lengths, to transport data from sources to sinks (DACs, receivers, etc.) on a daily basis without issue.


    2. Jocko Homo

    Which brings us to the Jocko Homo text. It adds nothing concrete to the discussion. It's all, "If this..., if that...." What are the reflection coefficients for cables commonly used as digital interconnects fitted with standard RCA connectors? How does the added jitter/noise/distortion vary as a function of cable length, sampling rate, etc.? Where are the measurements, especially at the output of the DAC?

    Let's be generous and assume that the cable length induces a change in the transmission jitter level seen by a receiver, two things then need to be shown:

    1. that the change is reflected by a measured change in output of the DAC;
    2. that the changes are significant enough to be audible.

    We already know that any induced change in transmission jitter level is insufficient to introduce bit errors, as we would (a) hear them immediately (especially for DTS/DD signals); and (b) people have recorded the S/PDIF signal and found no difference with the ripped CD data!

    These people seem obsessed with the rise (and fall) time of the signal, which might be a concern with edge-triggered logic. However, S/PDIF uses Bi-Phase Mark coding, which can be considered a special case of Frequency Shift Keying (using frequencies F and 2F). Bi-Phase Mark is DC free, so can be sliced for decoding. Since it is known a priori that the clock rate must be based on either 44.1 kHz or 48 kHz, the receiver just needs to look for the rate of transitions in the data. Usually a logic 0 has a bit period of 2T and a logic 1 has a bit period of T, so there will be an extra transition to indicate the presence of a 1 compared to a 0. Even if the cable length affects exactly where the transitions occur (jitter), the fact that you know what the clock rates should be means that you can compensate appropriately.
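
    A toy sketch of that coding scheme (my own illustration, not production S/PDIF code) shows why the decoder only needs the pattern of transitions, not precise edge positions:

    Code:
    # Bi-phase mark: every bit cell begins with a transition, and a logic 1
    # adds a second transition mid-cell. So a 1 appears as two half-period
    # intervals (T) and a 0 as one full-period interval (2T).
    def bmc_encode(bits, level=0):
        halfcells = []
        for b in bits:
            level ^= 1              # transition at the start of every cell
            first = level
            if b:
                level ^= 1          # extra mid-cell transition for a 1
            halfcells.append((first, level))
        return halfcells

    def bmc_decode(halfcells):
        # A cell whose two half-cells differ carried a 1; equal, a 0.
        return [1 if a != b else 0 for a, b in halfcells]

    data = [1, 0, 1, 1, 0, 0, 1]
    assert bmc_decode(bmc_encode(data)) == data   # polarity-insensitive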


    3. Agon thread

    With regard to the Agon thread, my suspicion is that there was something wrong with the mains connection to either the transport or DAC (or amplifier) which was fixed when he switched cable. I notice he does not say whether he re-tried the original cable to see if the problem returned, nor whether he tried other 1 m cables. One thing is for sure, total audio drop-outs would not be caused by the scary reflection 'problem' in 1 m cables. For that to happen, either the S/PDIF signal has to be completely interrupted/corrupted, or the power supplies in the DAC (or following amplification) were drooping due to the switching noise from the furnace/water heater turning on/off.
     
  21. Tommy SB

    Tommy SB Forum Resident

    Location:
    Santa Barbara, CA

    Thanks Black Elk for that thorough reply...
     
  22. jea48

    jea48 Forum Resident

    Location:
    Midwest, USA
    Black Elk,

    Thanks for your reply.

    There is lots to read on the subject of the S/PDIF interface. I will be the first to admit a lot of it is over my head.

    From reading your last post, you seem to me to give best-case, ideal-world examples for the CD transport, digital interface, and the DAC. How about some examples of situations in the not-so-perfect world?

    Example: what happens when a small ground loop voltage is thrown into the mix and travels from the transport, through the digital cable, to the DAC? What problems does that small ground loop voltage cause for the digital information as it leaves the transport, passes through the digital cable, and arrives at the DAC?


    What effect does an overvoltage transient event sent out on the AC mains feeding a CD transport and/or a DAC have on the power supplies of the equipment, and on the ability of the DAC to hold the digital lock and not experience a dropout?



    Black Elk said:
    3. Agon thread

    With regard to the Agon thread, my suspicion is that there was something wrong with the mains connection to either the transport or DAC (or amplifier) which was fixed when he switched cable. I notice he does not say whether he re-tried the original cable to see if the problem returned, nor whether he tried other 1 m cables. One thing is for sure, total audio drop-outs would not be caused by the scary reflection 'problem' in 1 m cables. For that to happen, either the S/PDIF signal has to be completely interrupted/corrupted, or the power supplies in the DAC (or following amplification) were drooping due to the switching noise from the furnace/water heater turning on/off.



    Two quotes from the Agon thread by jadedavid.
    DAC drops out when Furnace starts or stops. HELP | Audiogon Discussion Forum »

    DAC drops out when Furnace starts or stops. HELP

    Every time my furnace or water heater (gas, power vented) kicks on or off my DAC drops out momentarily. My audio system, furnace and water heater are all on separate dedicated ac lines. My AC panel was updated to a 200 amp service some years back.
    I checked all connections from the panel to each device and internally checked all electrical connections in the furnace and water heater, verifying proper hot, neutral and ground.
    I checked and tightened all wires in the service entrance panel.
    Has this happened to anyone else? If so what did you do to correct it?
    Any help will be appreciated.
    jadedavid



    OK, so I started by changing the digital cable (it was the easiest). I had been using an Audio Envy cable that I really like. Had it about one month. After inserting it into the system I had an occasional drop out but didn't think a lot about it. However as the cold nights came I started having more and more drop outs. Which led me to tracking down the cause. Thus I discovered that it was the furnace and water heater making the disruption. Prior to the cold snap, it was only the turning on and off of the water heater that caused the drop out. But now with the furnace in the mix the drop out rate increased significantly.
    I replaced the AE cable with a Grover Huffman digital cable and last night experienced no drop outs.
    The AE cable is only one meter while the GH cable is 1 1/2M.
    Is it the length of the cable that could be the culprit or is it the design/construction differences?
    Either way, thanks for the suggestions to get the gremlins out. I will also look into the service panel today to check the wiring as to which leg of the mains might be in common.
    jadedavid


    I sent a PM to jadedavid using the Agon message system. I said I had a few questions for him regarding his Agon thread.

    Was he still using the 1.5m Grover Huffman cable, and was he experiencing any dropouts with it?

    Had he found and corrected any problems with the mains AC power system of his home since his last post to his Agon thread?

    He replied a few days later saying:
    He did not find or repair any AC mains electrical problems.
    Since his last posting on the Agon thread he had ordered a new 1m Grover Huffman digital cable. He said he has not experienced any dropouts with the new 1m Huffman digital coax cable.

    So something is different in the construction or resistance characteristics of the Grover Huffman digital coax cable, with RCA ends, compared to those of the 1m Audio Envy coax cable with RCA ends.

    Again, the DAC was able to hold the digital lock on the signal from the transport through the 1m Audio Envy coax cable fine until the closing or opening of an electrical contact in the furnace or hot water heater power vent.

    Line VD (voltage drop) on the AC mains power feeding the CD transport and DAC is not the cause of the dropouts. The OP said in his thread the equipment was fed from dedicated branch circuits, and the main electrical service had been upgraded to a 200 amp service.
    He also said the furnace and hot water heater power vent were fed by dedicated branch circuits.

    Without data from a power quality analyzer connected at the receptacle(s) at the end of the dedicated branch circuits feeding the CD transport and DAC, it's hard to nail down what is causing the digital dropouts in the equipment fed from the AC power system of the OP's home.


    Something is different, though, when the Audio Envy digital coax interface cable is being used. It's as if the DAC's digital lock is being maintained by maybe a hair, and even the slightest poor power quality event on the AC power feeding the transport or DAC causes a dropout.
    1s and 0s, bits is bits.

    Black Elk, what are your thoughts?


    For your consideration.
    This is a quote from a white paper written by Chris Dunn and Malcolm Hawksford.
    I would appreciate your thoughts after reading the entire paper.

    Conclusions.
    Is the digital audio interface flawed? We have examined the possibilities of both amplitude and timing errors corrupting audio data transmitted across an interface. The probability of received amplitude errors is not high, and indeed they are most likely to occur in the preamble of each interface subframe. This means that if a receiver can lock onto an incoming interface signal, then the audio word values are safe. However, jitter remains a concern; several jitter mechanisms exist for the biphase-mark-encoded signal, the biggest problem being that of bandwidth limitation at any stage of the interface. We have shown that band-limited interface jitter has a strong relationship to the bit structure of the serial interface code, and hence can be highly correlated with the transmitted audio data. Measurements have confirmed jitter levels of higher than 1ns in an above-average interface circuit.
    The effects of jitter can be predicted by forming error models for different DAC architectures. It can be shown that, compared to low-oversampling multibit designs, pulse-density modulation converters are much more sensitive to jitter when producing low-frequency audio signals. This may explain certain subjective characteristics of PDM DACs that otherwise cannot be rationalized. A simple model of jitter-error audibility has shown that a DAC can tolerate white jitter noise of up to 180ps, but that even lower levels of sinusoidal jitter may be audible. These limits place tough constraints upon digital interface design, and it is recommended that interface receiver PLLs have closed-loop cutoff frequencies as low as possible. For the ultimate immunity to the effects of jitter, a second digital audio interface employed at the receiver can be used to slave the transmitter.

    Conclusion Quote:
    Bits is Bits? Page 8 »

    Entire article:
    Bits is Bits? »

    Original White Paper:
    http://www.scalatech.co.uk/papers/dunn_hawksford_1992.pdf
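
    To put the paper's 180 ps and 1 ns figures in perspective, here is the usual first-order jitter-error model (my sketch, not taken from the paper): sampling a full-scale sine at frequency f with RMS timing error sigma leaves an error floor of roughly -20*log10(2*pi*f*sigma) dB.

    Code:
    import math

    def jitter_snr_db(f_hz, sigma_s):
        # The error of sampling sin(2*pi*f*t) with timing error tau is
        # about tau * dx/dt, so RMS jitter sigma gives this SNR relative
        # to full scale.
        return -20 * math.log10(2 * math.pi * f_hz * sigma_s)

    print(f"{jitter_snr_db(20_000, 180e-12):.1f} dB")  # ~93 dB at 180 ps
    print(f"{jitter_snr_db(20_000, 1e-9):.1f} dB")     # ~78 dB at 1 ns

    At 180 ps the jitter floor sits near the 16-bit quantization floor; at the 1 ns the paper measured, it would sit well above it.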
     
    Last edited: Feb 16, 2016
  23. Black Elk

    Black Elk Music Lover

    Location:
    Bay Area, U.S.A.
    Why bother considering them? The existence of such well-engineered products shows that digital, when properly implemented, does exactly what it should, and bits simply are bits.

    If you want to consider lesser designs, be my guest!

    Your wording still does not make clear whether he re-tried the original 1 m cable after trying the new cables. If he did, I can't explain why he only gets drop-outs with the original cable. He should contact the manufacturer to have the cable tested.

    I have met Prof. Hawksford on a number of occasions, have demonstrated DSD and his own recordings to him (on expensive reference systems), and have sat on the same panels as him at a number of AES Conventions (those being on the topic of sigma-delta modulation, which is one of his main research areas). I remember this paper and the coverage it received in Stereophile and Hi-Fi News & Record Review at the time. Since the paper is nearly 25 years old, and he may have changed his position on it, rather than do a detailed critique let me draw your attention to this part of the text:

    The same measurement made on another Bitstream D/A processor, the Audiolab 8000 DAC, reveals no jitter error at all (fig.30); this is due to the superb PLL performance of this model, offering a (claimed) closed-loop cutoff frequency of just 13Hz, where all audio-frequency interface jitter will be attenuated to inaudible levels. (Note that the noise seen at 400Hz in this diagram is due to a ground-loop problem in the test apparatus.)

    [Fig. 30 from the paper]

    Which means 25 years ago (just about) there were DAC solutions which did exactly the same as the Benchmarks and others: completely eliminate interface jitter. Prof. Hawksford was skeptical of the jitter audibility study he referenced, but all later studies have found the same, that jitter levels must be in the tens or hundreds of NANOseconds (as far as I know, and as I indicated above). You will note that there is no discussion of whether he (or others) could determine any audible differences, either between the DACs or with the application of different levels of jitter or different amounts of band-limiting.

    There is also no discussion of practical levels of band-limiting in real systems, either consumer or professional. The paper focused on S/PDIF, but with regard to AES-EBU, it states:

    Nevertheless, Cabot [4] presents an interface example in which bit errors are negligible for noise levels up to 20dB below the interface signal level with RC filtering up to 160ns, and claims to have achieved zero error-rate transmissions over an unmatched digital audio link of 100m length.

    This should be expected given the higher operating levels of AES-EBU, its more stringent electrical specification, and the fact that oodles of studios run AES-EBU over such lengths on a regular basis.

    So, I don't know why he went to all the trouble of modeling, and trying to match some practical implementations to his findings, when solutions existed (for both AES-EBU and S/PDIF) that showed no ill effects from the band-limiting/added jitter. Rather than suggest changes to auxiliary bits in the AES-EBU/S/PDIF specification to counteract this 'problem', all one has to do is design the digital front-end like that in the Audiolab DAC he measured (remember that it was measured through the 'band-limiting' Philips SAA7274).

    The answer, then, to the question he posed ('Is the digital audio interface flawed?') must surely be, No!
     
  24. jea48

    jea48 Forum Resident

    Location:
    Midwest, USA
    Black Elk said:
    Why bother considering them? The existence of such well-engineered products shows that digital, when properly implemented, does exactly what it should, and bits simply are bits.

    Does this CD transport and DAC fit your definition of well-engineered products?
    [Review] Little Dot CDP_I - DAC_I listening test [English] »


    If you want to consider lesser designs, be my guest!

    Lesser designs? 1s and 0s, bits is bits. It either works or it don't?


    ////


    Your wording still does not make clear whether he re-tried the original 1 m cable after trying the new cables. If he did, I can't explain why he only gets drop-outs with the original cable. He should contact the manufacturer to have the cable tested.


    I sent another email to jadedavid and asked if he tried the Audio Envy digital coax cable again after using the Grover Huffman digital cable. He replied yes, he tried the AE cable again, and again experienced dropout problems when the furnace and hot water heater power vent would turn on. He said the 1m Grover Huffman digital cable works flawlessly, absolutely no dropouts.

    I also asked jadedavid what he was using for a CD transport and DAC. He replied he was using a Pioneer PD-65 as a transport (RCA digital out) and an Eastern Electric MiniMax Plus DAC.

    The Pioneer first came out in 1992.
    Pioneer PD-65 Manual - Stereo Compact Disc Player - HiFi Engine »

    Eastern Electric MiniMax Plus DAC.
    MMpreIntro »



    Did you forget to answer my two questions from my previous post? I would appreciate your thoughts. Please be as specific as possible with your answers.

    Quote of mine from last post:

    From reading your last post, you seem to me to give best-case, ideal-world examples for the CD transport, digital interface, and the DAC. How about some examples of situations in the not-so-perfect world?

    Example: what happens when a small ground loop voltage is thrown into the mix and travels from the transport, through the digital cable, to the DAC? What problems does that small ground loop voltage cause for the digital information as it leaves the transport, passes through the digital cable, and arrives at the DAC?


    What effect does an overvoltage transient event sent out on the AC mains feeding a CD transport and/or a DAC have on the power supplies of the equipment, and on the ability of the DAC to hold the digital lock and not experience a dropout?


    Again I would appreciate your thoughts.


    /////

    I have met Prof. Hawksford on a number of occasions, have demonstrated DSD and his own recordings to him (on expensive reference systems), and have sat on the same panels as him at a number of AES Conventions (those being on the topic of sigma-delta modulation, which is one of his main research areas). I remember this paper and the coverage it received in Stereophile and Hi-Fi News & Record Review at the time. Since the paper is nearly 25 years old, and he may have changed his position on it, rather than do a detailed critique let me draw your attention to this part of the text:

    The same measurement made on another Bitstream D/A processor, the Audiolab 8000 DAC, reveals no jitter error at all (fig.30); this is due to the superb PLL performance of this model, offering a (claimed) closed-loop cutoff frequency of just 13Hz, where all audio-frequency interface jitter will be attenuated to inaudible levels. (Note that the noise seen at 400Hz in this diagram is due to a ground-loop problem in the test apparatus.)

    [Fig. 30 from the paper]

    Which means 25 years ago (just about) there were DAC solutions which did exactly the same as the Benchmarks and others: completely eliminate interface jitter. Prof. Hawksford was skeptical of the jitter audibility study he referenced, but all later studies have found the same, that jitter levels must be in the tens or hundreds of NANOseconds (as far as I know, and as I indicated above). You will note that there is no discussion of whether he (or others) could determine any audible differences, either between the DACs or with the application of different levels of jitter or different amounts of band-limiting.

    There is also no discussion of practical levels of band-limiting in real systems, either consumer or professional. The paper focused on S/PDIF, but with regard to AES-EBU, it states:


    But we have not been talking about AES/EBU test results in this thread. Our discussion, or at least mine, has always been in regard to a 75 ohm digital coax cable with RCA plugs terminated on each end.


    ////


    Nevertheless, Cabot [4] presents an interface example in which bit errors are negligible for noise levels up to 20dB below the interface signal level with RC filtering up to 160ns, and claims to have achieved zero error-rate transmissions over an unmatched digital audio link of 100m length.

    This should be expected given the higher operating levels of AES-EBU, its more stringent electrical specification, and the fact that oodles of studios run AES-EBU over such lengths on a regular basis.

    So, I don't know why he went to all the trouble of modeling, and trying to match some practical implementations to his findings, when solutions existed (for both AES-EBU and S/PDIF) that showed no ill effects from the band-limiting/added jitter. Rather than suggest changes to auxiliary bits in the AES-EBU/S/PDIF specification to counteract this 'problem', all one has to do is design the digital front-end like that in the Audiolab DAC he measured (remember that it was measured through the 'band-limiting' Philips SAA7274).

    The answer, then, to the question he posed ('Is the digital audio interface flawed?') must surely be, No!



    Sorry, that's not the way I read the conclusion of the white paper.

    It should also be mentioned that the test was not conducted using a 75 ohm digital coax cable terminated with RCA plugs; the jacks on the CD transport and DAC were not RCA jacks.

    I can't help but wonder what the test results would have shown back then if they had.
     
    Last edited: Feb 22, 2016
  25. Black Elk

    Black Elk Music Lover

    Location:
    Bay Area, U.S.A.
    Impossible to say without measurements.


    It still requires good implementation/engineering!


    I responded to everything I felt that I wanted to comment on. I have neither the time nor inclination to address every single issue (theoretical or practical) in digital transmission systems.


    You asked me to read a 36-page technical paper entitled: Is The AES-EBU/SPDIF Digital Audio Interface Flawed?


    You are free to draw whichever conclusions you like. You asked me for my thoughts, and I gave them. Do you agree that Hawksford's own measurements of the Audiolab DAC show that it was possible even then to design SPDIF receiver circuits which are unaffected (in practice) by the problems he was trying to draw attention to?


    Where is that stated in the paper?
     
    Last edited: Feb 25, 2016