There's no direct comment from the band, but Everything Must Go was recorded analog, so maybe not? Cheers, Paul
Appeal to authority as an argumentative technique does not make the assertion true. If Steve Hoffman really did come to that conclusion, he was incorrect, unfortunately. Maybe his context for comparison was incomplete, or his 35-year-old recollection was faulty. And you seem to have an agenda in this thread to protect the sanctity of the Mastering 7 repressings -- in contravention of the abundant factual evidence that indicates Mastering 3 is the starting point. AJ
It's not just an appeal to authority; it's clear evidence from a primary source who mastered the album. No one on this forum who has mastered this album, nor anyone else whatsoever, has provided any evidence to the contrary that #7 is the SH master and #3 is the Roger Nichols master.
BTW, if you listen to the difference between the two masterings, it clearly consists of bass (stronger at ~60Hz) plus some hiss and "air" (above 12kHz). But there is also a difference reaching into the 1kHz-2kHz midrange, all of which could indicate a different (or retouched) mastering. This could be apparent to trained ears.
I've had a little maths at school. There is no way an engineer took the tape with the digital master, A) fed it to a Sony PCM 16xx machine, B) fed the analog output to an analog equalizer, and C) fed that output to another Sony PCM 16xx to re-digitize the signal, and came up with digital data so similar to what was contained on the original tape.
Mastering 3 and Mastering 7 are digitally sample-aligned. That fact precludes any analog intermediate step between them. They are the same digital transfer and the same mastering. The numbers do not lie. AJ
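For anyone who wants to check this kind of claim themselves, the alignment test is easy to sketch in Python with numpy. The data below is entirely synthetic stand-in material (not the actual rips); the point is just the method: find the cross-correlation lag, then see whether a gain-matched phase-invert difference nulls.

```python
import numpy as np

rng = np.random.default_rng(0)
master = rng.standard_normal(8000)      # stand-in for mastering 3
copy = 0.9 * master                     # stand-in for mastering 7 (pure level shift)

# Cross-correlate to find the lag that best aligns the two files
corr = np.correlate(copy, master, mode="full")
lag = int(np.argmax(corr)) - (len(master) - 1)
print(lag)  # 0 -> sample-aligned, consistent with the same digital transfer

# Gain-match, then phase-invert difference: a pure digital level change nulls
gain = np.dot(copy, master) / np.dot(master, master)
residual = copy - gain * master
print(np.max(np.abs(residual)))  # essentially zero (floating-point dust)
```

On real needle-drops or rips you'd run this on a short, loud excerpt; any analog pass between the two generations would show up as a non-zero lag or a residual well above the noise floor.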
Here's a comparison between 7 and 3: a waveform in the time domain and an EQ chart in the frequency domain. Spot the differences...
No guys, a drift would only appear when doing a different analog tape transfer. It's possible to have an analog step between two digital sources and get sample-aligned digitizations if they're at the same sampling rate. There was even an option on the PCM1630 to timecode it. Anyway, simply changing levels in the digital domain on a PCM16xx system, as was theorized, wouldn't change the bass or add hiss and noise at a -48dB level. So that's just not the explanation, and that's what I was getting at.
It seems to me that sampling rate is of no consequence once the signal has left the digital domain. After some sort of analog equalizing had been done, a certain analog time lag would have been introduced, which would make it impossible for the second 1630 to align the samples precisely. Remember, all of this had to be done in real time. Please educate me further, dear sir.
Are you talking about phase displacement? That time lag would only happen if the amount of processing were significant (say, a high-pass filter that displaces the signal by 10ms), and it would be easy to compensate for it. There is much less processing done here. I'm not saying the signal was necessarily fed through an EQ; maybe it went through a console or some preamp, because simply doing a digital copy with changes in the digital domain wouldn't give such differences. Adding noise ~40dB below signal would be audible to a trained mastering engineer on a system he knows. What I'm getting at is that one of the masterings is the source while the other has been modified. They are therefore not "the same mastering".
I can never remember how to read those, but is that EQ chart above showing the EQ of both masterings looking exactly the same and therefore overlaid exactly on each other, or is it showing one mastering's EQ difference relative to the other?
Yes, the white and green lines, showing absolute values in dB on the right-side scale, are lying on top of each other. The orange dotted line shows the relative EQ difference in dB on the left-side scale. It's a flat line in this case, meaning no difference.
It is more visible here: spectrograms of the frequency difference between the provided 30s samples of masterings 3 & 7. First 0Hz to 22kHz [spectrogram], then 0Hz to 400Hz [spectrogram]. Basically, if it were just a level shift or a copy in the digital domain, the images would be entirely blue. Here you can see what suspiciously resembles ~56Hz electric hum and analog hiss, along with other artifacts.
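If anyone wants to reproduce this kind of difference analysis at home, here's a minimal numpy sketch. The "modified" file below is synthetic: I'm deliberately injecting a hypothetical 56Hz hum and some hiss into a noise-like program signal, purely to show how the hum pops out once the program material nulls away in the phase-invert difference.

```python
import numpy as np

fs = 44100
t = np.arange(fs * 2) / fs
rng = np.random.default_rng(1)
program = 0.1 * rng.standard_normal(t.size)        # stand-in program material

# Hypothetical "modified" version: same program plus mains hum and hiss
hum = 10 ** (-48 / 20) * np.sin(2 * np.pi * 56 * t)    # 56Hz hum at -48dBFS
hiss = 10 ** (-60 / 20) * rng.standard_normal(t.size)  # hiss at -60dBFS
modified = program + hum + hiss

diff = modified - program                           # phase-invert difference
spectrum = np.abs(np.fft.rfft(diff * np.hanning(diff.size)))
freqs = np.fft.rfftfreq(diff.size, 1 / fs)
peak_hz = freqs[int(np.argmax(spectrum))]
print(round(peak_hz))  # 56 -> the injected hum dominates the difference
```

A real spectrogram adds a time axis (successive windowed FFTs), but the principle is the same: a pure digital copy would leave nothing in `diff` at all.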
Thanks. So is this something where, if you did not "level match" them (or whatever the better term is when comparing them like this), they would show an EQ difference? In that sense, if you listened to one against the other without that adjustment, might there be a difference? Edit: OK, it seems like @Plan9 is showing differences, if I'm reading that right, which is what I thought he was saying earlier.
Sadly, that's impossible to tell from a phase-invert difference between the two. Keep in mind the difference is low in volume, with peaks around -48dBFS.
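For reference, "peaks around -48dBFS" just means the loudest sample of the difference signal sits about 48dB below digital full scale. A tiny Python/numpy illustration (the sample values here are made up):

```python
import numpy as np

def peak_dbfs(x):
    """Peak level in dB relative to digital full scale (|x| == 1.0)."""
    return 20 * np.log10(np.max(np.abs(x)))

# A difference signal whose loudest sample is 48dB below full scale:
diff = 10 ** (-48 / 20) * np.array([0.2, -1.0, 0.5])
print(round(peak_dbfs(diff)))  # -48
```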
For fun, would you show us the frequency difference spectrogram of the test I proposed in post #355: boost a track at 16-bit resolution, attenuate it by the same value, and compare with the original unaltered track?
Here's a spectrogram of the frequency difference between the Nimbus and the same file digitally boosted and then attenuated by the same value at 16-bit resolution:
I did and got an absolute null. I used Soundforge and not Audacity. EDIT: SF probably can't work at 16-bit internally, even if the files are at that resolution. So instead I did the experiment by only boosting the file by 0.1dB without attenuating it back. I got similar-looking (though not identical) noise at around -88dB. So I guess it's possible the PCM16xx would fare worse when you try to change the levels. Thanks for the experiment @strippies. It doesn't change the fact that one mastering is the source while the other was modified after the fact.
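To put rough numbers on why a pure digital level change can't explain a -48dBFS difference: re-quantizing a 16-bit signal after a 0.1dB boost produces rounding errors of at most half an LSB, i.e. around -96dBFS. A numpy sketch with synthetic data (and obviously `np.round` is not a PCM16xx, whose internal arithmetic may well fare worse, as noted above):

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic 16-bit "file": integer sample values well inside [-32768, 32767]
orig = np.round(6000 * rng.standard_normal(100000))

gain = 10 ** (0.1 / 20)            # +0.1 dB as a linear factor
boosted = np.round(orig * gain)    # boost, then re-quantize to integer samples

# Compare against a lossless (floating-point) gain change of the same amount
error = boosted - orig * gain
peak_err_dbfs = 20 * np.log10(np.max(np.abs(error)) / 32768)
print(round(peak_err_dbfs))  # -96: pure rounding error, nowhere near -48dBFS
```

So even a worst-case re-quantization floor sits almost 50dB below the difference actually measured between masterings 3 and 7, which is the point: level changes alone don't add bass, hum, or hiss at -48dB.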