Discussion in 'Music Corner' started by Steve Hoffman, Apr 2, 2007.
Thanks Steve - that's a good article.
That has nothing to do with compression. Sound pressure is inversely proportional to distance. So while moving from 12" to 2" will result in a 6x change in pressure, moving from 120" to 110" will result in a 1.09x change in pressure.
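That back-of-the-envelope math can be sketched in a few lines of Python (assuming the simple point-source 1/r model described above; the helper names are illustrative, not from any audio library):

```python
import math

def pressure_ratio(r_old, r_new):
    """Ratio of sound pressure when a mic moves from r_old to r_new (1/r model)."""
    return r_old / r_new

def ratio_in_db(ratio):
    """Express a pressure ratio in decibels (20 * log10)."""
    return 20 * math.log10(ratio)

print(pressure_ratio(12, 2))                      # 6.0 -> about 15.6 dB louder
print(round(pressure_ratio(120, 110), 2))         # 1.09 -> under 1 dB
```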
Ironically, one of the main functions of a compressor is to actually prevent clipping.
Without compression it would be very hard to record some sources without clipping. There are many, many ways to clip a signal in the studio. Every engineer has to deal with what we call gain structure. There is usually a fairly narrow range for setting optimal levels. If the levels are too low there may be too much noise. If you set the level too high there may be clipping. Gain structure has to be dealt with all the way from the microphone to the recorder, and there are many places along the way where clipping can occur. If the mic signal is too high, a pad (attenuator) may have to be used before the signal even hits the console, or the front end of the mic preamp can clip or the mic transformer can saturate. Of course moving the mic back a bit can help with this. And of course each type of signal has its own characteristics.
For example, a triangle can have peaks way above its average value. If you record a triangle and hit 0 dB on the VU meters, you're likely going to have problems. Using peak meters instead of VU meters will shed some light on this. Maintaining proper gain structure on the fly in a fast-moving session can be a real challenge.
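The peak-versus-average distinction can be illustrated numerically. This is a hedged sketch, not session data: a steady sine (the kind of signal a VU-style average reading tracks well) next to a sparse transient burst standing in for a triangle hit. The crest factor (peak divided by RMS) is what separates them:

```python
import math

def peak_and_rms(samples):
    """Return the peak level and the RMS (roughly what a VU-style average sees)."""
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return peak, rms

n = 1000
# Steady sine tone: crest factor is only sqrt(2), about 1.41.
sine = [math.sin(2 * math.pi * i / 100) for i in range(n)]
# Sparse spikes over a quiet bed: huge peaks, tiny average.
transient = [1.0 if i % 200 == 0 else 0.05 * math.sin(2 * math.pi * i / 100)
             for i in range(n)]

for name, sig in [("sine", sine), ("transient", transient)]:
    peak, rms = peak_and_rms(sig)
    print(name, "crest factor:", round(peak / rms, 2))
```

The transient signal's crest factor comes out roughly ten times higher than the sine's, which is why an average-reading meter can sit at 0 while the peaks are well into trouble.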
Doing live PA one can get away with a little clip here and there, but when the mix is recorded for professional release the clip will be there for all to hear. I can't tell you how often I hear clipping on big time commercial recordings. Sometimes the take with the clip is deemed so good that the clip is accepted. Other times it passes because it wasn't noticed.
These clips are usually on individual tracks and usually have little to do with the overall dynamic range of the recording. If the overall mix clips, that is a whole different situation. In other words, if you look at the waveform of the individual clipped track it will look like a clip, but if you look at the overall waveform of the complete mix, the clip may not be visible to the untrained eye.
Something I've Been Wondering...
Why the objection in this forum to CDs being mastered too loud? If they're not clipped or distorted (crucial to my query), what is it about this loudness* that makes it fatiguing or bad? Shouldn't mastering as loudly as possible take advantage of a wide dynamic "height", and use more "bits" for each sound, and therefore less digital "grain"?
*Not to be confused with the "loudness" switch on 70's receivers/amps (Fletcher/Munson effect, etc.).
Yes. That's true for non-limited mastering (which is extremely rare now).
On the other hand, if you make the level too hot with a peak limiter the dynamic range and resolution are reduced even though you're at full digital level.
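As a rough illustration of that trade-off, here is a minimal Python sketch (a crude brick-wall clamp with makeup gain, not how any real mastering limiter is implemented): clamping the peaks and then boosting everything back up to full scale narrows the gap between quiet and loud passages.

```python
def limit_and_boost(samples, threshold, ceiling=1.0):
    # Crude brick-wall limiter sketch: clamp anything above the threshold,
    # then apply makeup gain so the loudest peak sits at the ceiling.
    clamped = [max(-threshold, min(threshold, s)) for s in samples]
    gain = ceiling / threshold
    return [s * gain for s in clamped]

quiet, loud = 0.1, 1.0                          # 20 dB between passages
mastered = limit_and_boost([quiet, loud], threshold=0.5)
print(mastered)  # quiet passage comes up to 0.2, loud is pinned at 1.0
```

The result still peaks at full digital level, but the passages that were 20 dB apart are now only about 14 dB apart: the level went up while the dynamic range went down.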
In addition to compressors and limiters, all equipment responds non-linearly over a wide dynamic range. Tube circuits and magnetic tape will provide a degree of "soft" compression/limiting as they approach saturation. Solid state components remain fairly linear up to the point of hard clipping. Exceeding the max input voltage on an A/D converter has very nasty results. This is one reason why tubes and tape aren't going away anytime soon.
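A toy numerical contrast between the two behaviors (tanh is a common textbook stand-in for soft saturation, not a model of any particular tube or tape circuit):

```python
import math

def hard_clip(x, ceiling=1.0):
    # Solid-state style: linear right up to the rail, then abruptly flat.
    return max(-ceiling, min(ceiling, x))

def soft_clip(x):
    # Tape/tube-style sketch: tanh bends gradually as it nears the rail.
    return math.tanh(x)

for level in (0.1, 0.5, 1.0, 2.0, 5.0):
    print(level, hard_clip(level), round(soft_clip(level), 3))
```

At low levels the two track each other almost exactly; the difference only shows up as the signal approaches the ceiling, where the soft curve rounds the peaks off instead of slicing them flat.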
That's a good way to compress something without a compressor. Analog tape is the best compressor ever made.
Although in theory you're right, think about it for a minute. If someone stood next to you and yelled at all times, you'd find it pretty annoying. Or else, ever listen to the test pattern on a TV station? That constant square wave is really annoying. That's what's happening with a lot of CDs that have been maxed out. Everything is crammed up in a narrow range near digital zero and as a result, the sound is congested, relentless, and very fatiguing. Some people on a quick listen think they're hearing more "detail" so therefore it must be better, but all they're hearing is the detail artificially boosted to the point where it's all the same.
Unless you distort the original signal, mastering as loudly as possible means setting the level so that the loudest peak is still within the linear range. Which means that quiet passages will still be quiet. Modern "loud" recordings use limiting, which by definition is distortion.
That's what I'm referring to. So really, the "loud" CDs that folk complain about are those as in Steve Thomson's post - it's not the actual volume, it's the limited/squashed dynamic range. After all I can turn down a CD that's merely loud; that's what confused me.
We must remember that mic pre amps have a limited dynamic range.
Recorders can now cope with a 120 dB+ range.
Mics can cope when they have good P48 and small condenser diaphragms.
Mic amps have a harder time.
Limiters are engaged in recording to prevent clipping; this may never happen if the channel is properly modulated.
Compressors come further down the chain to ease dynamic intelligibility.
Limiters and compressors, like children, should be seen but not heard.
Unlikely but it can happen.
That is the famous 'soft clipping,' correct?