Discussion in 'Audio Hardware' started by love4another, Jul 6, 2012.
Is there a general consensus about this or is it still debated?
The decision should be made based on which component has the better DAC. If it is the CD player, go analogue. If it is the AVR, go digital.
Go with whichever one you like the sound of best.
I agree with both posters above. Personally I wouldn't expect audible differences between optical and coax, yet many seem to prefer coax as the digital connection (maybe someone can explain why). I have also heard that optical is more stable over longer distances.
If you have a receiver with a built-in DAC and these inputs, try all three methods and choose your favorite to listen to.
I generally have found coaxial digital cables to sound a bit better than most optical cables, though I do have glass optical cables that sound better than the plastic optical cables, and rival the coaxial.
My home office CD player claims to have such magic DACs that it has two external digital inputs (coax and optical) so you can route other digital sources through it. Analog out for me in this room.
There really isn't any difference from a performance perspective.
I prefer coax for cost and reliability reasons. Fiber has a theoretical advantage over extremely long distances because it is inherently immune to electrical noise, but good fiber at that length can be pricey. S/PDIF is a fairly low-bandwidth digital signal, and any decent 75-ohm coax is plenty good for it up to hundreds of feet. I have never run into a coax S/PDIF failure. Coax is also much sturdier: the connector doesn't fall out, the cable won't crack or split or break like fiber, and anyone can terminate it in the field.
I use both personally; my Mac has Toslink out, for instance, and I run about 40 feet with no problem. But in systems I've installed, I always preferred coax wherever possible, particularly for longer runs, for these reasons. I've replaced far too many shattered and crushed optical cables. Coax is tried-and-true, and cheap.
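To put a rough number on "low-bandwidth": a consumer S/PDIF frame carries two 32-bit subframes per sample, so the raw payload rate is only a few Mbit/s even at high sample rates. A quick back-of-the-envelope sketch (the framing figures are standard S/PDIF; the helper name is just for illustration):

```python
# Rough S/PDIF data-rate arithmetic. Assumes standard consumer
# framing: two 32-bit subframes (one per channel) per sample.
# Biphase-mark line coding roughly doubles the transition rate on
# the wire, but the payload numbers below make the point.
def spdif_bitrate_mbps(sample_rate_hz, bits_per_subframe=32, channels=2):
    """Raw S/PDIF data rate in Mbit/s before line coding."""
    return sample_rate_hz * bits_per_subframe * channels / 1e6

for fs in (44_100, 48_000, 96_000):
    print(f"{fs} Hz -> {spdif_bitrate_mbps(fs):.4f} Mbit/s")
```

Even at 96 kHz that's only about 6 Mbit/s of payload, which is trivial for any decent 75-ohm video-grade coax.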
Use both analog and digital coax to see which one sounds better to you.
Plug them into different inputs on the receiver so you can do instantaneous switching comparison.
I don't know if it's still the case nowadays, but optical cables were prone to microfracturing when bent while being plugged into the source or destination, so I'd avoid them.
Isn't optical also faster than coax?
I have always preferred optical, but what do I know?
I just posted about this in another thread, and it seems appropriate to quote myself here. I initially resisted getting into this thread, as it didn't seem sensible to open such a debate in the context of a CD player and "receiver", which probably implies insufficient system-wide resolution for any of this to really matter much...
Consider also that optical transmission subjects fragile S/PDIF to two unnecessary "media conversions", as it starts as an electrical signal at the source and finishes as an electrical signal after the receiving Toslink jack.
S/PDIF has much more in common with video signals than analogue audio... perhaps that is why audiophile cable companies appear to be completely incapable of designing an appropriate cable. Also note how heavily BNC connections are employed in commercial video applications...
To address The Hud's comment about optical being "faster": "speed" as you are thinking of it doesn't really come into the equation. In fact, due to the low quality of generally available Toslink transmitters and receivers, coax is generally preferable for higher-"speed" (i.e. higher sampling rate) applications.
As for stability over length, you really need to take environmental effects into account. Perhaps if you're in a room with tons of interference in the appropriate band, optical might beat coaxial... but professionals tend to go AES/EBU over XLR at this point.
Is this a serious question...?
I didn't know where to even start with that one.
I am always serious.
I like the optical connection for digital because, well, it kind of looks more digital.
I can't really tell if you are being sarcastic, but in case you aren't (or anyone reading is curious): we're talking large percentages of the speed of light in either medium (light in fiber, or an electrical signal down copper). The time it takes the sound to reach your ear from the speakers is many orders of magnitude longer than the signal's travel time in either case. To say that the 'speed' of signal propagation between fiber and coax has no significance in this particular application is an understatement.
Signal speed down a cable can matter in some applications, but usually over extremely long distances with a single cable or, more often, when using multiple cables (or multiconductor cables) where time alignment is critical. This is not such an application: with just one signal down one cable, the difference between, say, 65% and 70% of the speed of light is beyond immaterial.
The chips involved here spit out an electrical signal; the sending optical jack converts it to light, the receiving optical jack converts it back to electrical, and the receiving chip finally gets the electrical signal. To further illustrate the point, you can literally remove a Toslink optical jack from the circuit board, solder on a jack for a coax cable, and it'll work.
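For anyone curious about the actual magnitudes in the "speed" discussion above, here's a quick sketch. The velocity factors and the 10 m / 3 m distances are illustrative assumptions, not measurements:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s
SOUND = 343.0      # speed of sound in air, m/s (approx., room temperature)

cable_len = 10.0    # metres of cable (illustrative)
listen_dist = 3.0   # metres from speaker to ear (illustrative)

# Assumed velocity factors, purely for illustration:
t_coax = cable_len / (0.66 * C)   # ~66% of c for typical coax
t_fiber = cable_len / (0.68 * C)  # ~68% of c for plastic fiber
t_air = listen_dist / SOUND       # acoustic delay to the listener

print(f"coax delay:  {t_coax * 1e9:.1f} ns")
print(f"fiber delay: {t_fiber * 1e9:.1f} ns")
print(f"air delay:   {t_air * 1e3:.2f} ms")
```

The cable delays come out in tens of nanoseconds, the acoustic path in milliseconds, so the coax-vs-fiber difference is buried roughly a million times below the time the sound spends crossing the room.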
With my very old DAC (1990s John Westlake designed Dacmagic) I have a choice of optical and coaxial via a front panel knob.
There is definitely a difference (Westlake himself preferred the optical) - the optical sounds less etched / bright than the coaxial. I use this as a form of tone control - if the recording is a bit bright I use the optical - if it is dull sounding I go for the coaxial.
I have read that coaxial connections are susceptible to radio-frequency interference (RFI) and electromagnetic interference (EMI), creating hum. Might this type of interference be created if a coaxial cable is used near a Wi-Fi connection for an A/V receiver?
The interference would not sound like hum, at least not into an analog interface. On a digital interface it would not really be much of a concern, and if you're using decent coax it's very heavily shielded anyway. However, it is true that optical is inherently immune to any such interference.
Optical is cool because it looks like lasers!! Pwew...pwew...pwew...pwew!!!
There's one thing that I always try to point out in these types of threads if nobody beats me to it: On many modern receivers, the analog inputs are redigitized and passed through the internal DAC if you use any type of DSP, Room Correction, Bass Management (for "small" speakers), Phase Matching, etc. What's almost always left out of the documentation, though, is the resolution/frequency of the A to D conversion.
So if you're not using your receiver's version of "Direct Mode" or "Pure Mode" or whatever they happen to call it, then the signal is more than likely going through the receiver's internal DAC anyway and any analog inputs might be going through one extra a/d d/a step. Whether that is audible or not is debatable.
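On whether that extra A/D step is audible: one relevant number is the theoretical signal-to-noise ratio of an ideal N-bit converter, roughly 6.02·N + 1.76 dB for a full-scale sine. A quick sketch (textbook formula; real converters fall well short of these ideals):

```python
def quant_snr_db(bits):
    """Theoretical SNR of an ideal N-bit quantizer for a
    full-scale sine wave: 6.02*N + 1.76 dB (textbook figure)."""
    return 6.02 * bits + 1.76

for n in (16, 24):
    print(f"{n}-bit converter: {quant_snr_db(n):.2f} dB theoretical SNR")
```

If the receiver redigitizes at 24 bits the theoretical floor is far below audibility, but since the documentation rarely states the A/D resolution, you can't know what you're actually getting.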