When CDs were first introduced back in the 1980s, why were they only 16-bit/44.1kHz? Why couldn't they be 20-bit/44.1kHz or 24-bit/48kHz? I'd rather have a 60-minute CD at 24-bit/48kHz than an 80-minute CD at 16-bit/44.1kHz. Did all the sound engineers agree that 16-bit/44.1kHz was sufficient to capture the audio from analog master tapes? Wouldn't that prove that so-called "hi-resolution" music today is overkill: not necessary, or not discernible to human ears?
I don't think they had higher bit depths to choose from. It has been said that an executive in charge of something or other really liked a particular Beethoven symphony (the 9th?) and they set the disc's playing time so that CDs would be long enough to hold it without a break. Probably a more technical reason thrown in there too.
Philips only wanted to use 14 bits! It was a matter of what the technology could deliver at the time. The sample rate is due to Nyquist: basically, it has to be a little more than twice the highest frequency you want to capture. 44.1kHz allowed for 20–20,000 Hz. When CD players were first introduced (1982 in Japan, 1983 in the US), most home and business PCs were only 16-bit computers. So there you have it.
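A quick back-of-envelope of the Nyquist math mentioned above (my own illustration, not from any spec document):

```python
# Sketch of the Nyquist relationship: the sample rate must exceed
# twice the highest frequency you want to capture.

def nyquist_limit(sample_rate_hz: float) -> float:
    """Highest representable frequency at a given sample rate."""
    return sample_rate_hz / 2.0

cd_rate = 44_100
limit = nyquist_limit(cd_rate)    # 22050.0 Hz
headroom = limit - 20_000         # 2050.0 Hz left for the anti-alias filter roll-off

print(f"Nyquist limit at {cd_rate} Hz: {limit} Hz")
print(f"Transition band above 20 kHz: {headroom} Hz")
```

That roughly 2kHz gap above 20kHz is what gave the analog anti-alias filters of the day room to roll off.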
Yes, that is correct about Beethoven's 9th; it was recommended by a conductor who was asked for input. 74 minutes was what was needed to hold the entire 9th symphony. 16-bit at a 44.1kHz sample rate was cutting edge in 1982.
Two of my great uncles did experimental work on digital audio in the mid-to-late 70s for RCA. They felt 8 bits by 44 kHz was "roughly equivalent" to the best mastered vinyl records of the day.
One tidbit of trivia: when the first CD player by Sony was demoed at its first public showing, it required more hardware than could actually fit in the chassis. They were still developing the decoder chip and laser electronics that would fit inside. So to be ready for the press in time, the additional hardware was hidden under the table the CD player was sitting on.
Jeepers! Well, I bow to your uncles' superior knowledge over mine, but that sounds like a horrendously rough equivalent, I would say.
The 8-bit word length goes back to Bell Labs when they were developing digital phone lines (the first one in service was in 1962, between St. Louis and Chicago). Bell Labs thought 8 bits at an 8,000Hz sample rate was adequate for voice telephony. The reason digital telephone service was developed is that they could use as many repeaters as needed to get the call to its destination with no degradation; analog amplifiers used as repeaters added noise at each repeater station.
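Those figures multiply out to the classic 64 kbit/s digital voice channel (quick arithmetic of my own; T1 framing overhead is ignored here):

```python
# 8 bits per sample at 8,000 samples/s gives the classic 64 kbit/s
# digital voice channel (DS0). A T1 line carries 24 such channels.

bits_per_sample = 8
sample_rate_hz = 8_000
channel_bps = bits_per_sample * sample_rate_hz   # 64,000 bit/s per call

t1_channels = 24
t1_payload_bps = channel_bps * t1_channels       # 1,536,000 bit/s of voice payload

print(channel_bps, t1_payload_bps)
```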
They knew what they were doing. They started at RCA in 1946 and worked until the mandatory retirement age. Television, then color television, then digital computers. By today's standards, yes, 8-bit would have been horrendous. In, say, 1974? Probably didn't seem too bad. 16 bits was the bleeding edge of the technology in 1982. They recalled hearing a very early digital audio demo (at Bell Labs, perhaps?) that used a bit depth of 4 bits! They claimed it sounded alright. Not great, but certainly intelligible.
4-bit was first demoed by Bell Labs around 1948, and it was with tube gear as switches. Even 4 bits would have required a ton of tubes biased for switching, most likely at a sample rate around 8,000Hz as well. I have heard 4-bit recordings, and while intelligible, they were not very good.
Do not forget that Denon was recording stereo music digitally in the early '70s, not for direct consumer use but in the recording studio. If I remember correctly, a few famous recording artists used it for a short time. Telarc Records also made digital recordings in the late '70s; they were sampled at a 50kHz sample rate. Ry Cooder released the first "digital" rock record in 1979 or early 1980. The LP was vinyl, just recorded digitally in the studio.
Well, a big driving factor for 16/44.1kHz was the equipment used to record digital audio. Some custom early digital recorders, like the Soundstream, worked at 50kHz or other odd sample rates. 3M's hideously expensive digital multitrack recorder also worked at 50kHz if memory serves (it was used on Christopher Cross and Fagen's Nightfly, among others), and it was so early to market that 16-bit A/D converters didn't really exist: 3M chained a 12-bit and an 8-bit converter together, with the extra 4 bits used for error correction.

But a lot of these devices relied on really expensive open-reel data recorders; any standard built around them would have been hugely expensive for studios to deploy, limited to only the most well-funded operations. Sony just happened to be a massive designer and manufacturer of a cartridge-based tape recording device, the VCR, that also had sufficient bandwidth to record digital audio. Both the large 3/4" professional U-matic recorders and Sony's then-new consumer Betamax format could record 16-bit, 44.1kHz digital audio onto videotape, forgoing the need for hideously expensive open-reel data recorders and tape, and making the tapes far more robust, compact and accessible. Instead of a finicky, semi-custom $10,000+ recorder and $50+ tapes, you were looking at a robust, interchangeable $1,000 VCR and $15 tapes.

The PCM-1 from Sony is probably the gadget that first brought the potential of digital recording to wider notice: Sony PCM-1 on thevintageknob.org. But it was its successor, the 16-bit PCM-F1, also built around VCR recorders, that first gained widespread adoption: In Praise of the Sony PCM-F1. It was probably Sony settling on 16/44.1 that made it the standard going forward.
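For a sense of the bandwidth those VCR-based recorders had to carry, the raw 16/44.1 stereo stream works out like this (my own arithmetic; error-correction and video-field formatting overhead are ignored):

```python
# Raw data rate of 16-bit / 44.1kHz stereo PCM, before any
# error-correction or formatting overhead is added.

bits_per_sample = 16
sample_rate_hz = 44_100
channels = 2

bps = bits_per_sample * sample_rate_hz * channels   # 1,411,200 bit/s
bytes_per_minute = bps * 60 // 8                    # 10,584,000 bytes, about 10.6 MB/min

print(bps, bytes_per_minute)
```

About 1.4 Mbit/s sustained: trivial for a video recorder, but far beyond any affordable audio-tape medium of the late '70s.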
This was the start of their ascent to the pinnacle of the consumer electronics space at the end of the '70s, with the arrival of the Walkman, digital audio, the dominance of their Trinitron picture tube and their peerless industrial design. They had a few flops along the way, but I don't think a single other firm has done as much to advance consumer electronics on as many fronts at once in my lifetime as Sony did.
I can understand that 16-bit/44.1kHz was the best available back then, but eventually, with DAT, 20-bit and 24-bit at 48kHz were introduced. Why didn't they increase the resolution of CDs to 24-bit/48kHz? Imagine what they would have sounded like when properly mastered. 16-bit/44.1kHz CDs should have stayed in the 1980s, and higher-resolution CDs should have dominated the market afterwards. Even today, there is no reason not to have 24-bit/48kHz CDs available.
Because they wouldn't have been compatible with existing players, that's why. Also, it was a few years after the introduction of CD before shorter-wavelength solid-state lasers were available and affordable, which would have been required to make a higher-capacity disc capable of holding 24/48 audio. Sony eventually did come up with the backwards-compatible SACD format, using separate layers: one with traditional 16/44 CD audio, another with DSD that was invisible to old CD players. But that technological trick didn't exist in the early '80s and wouldn't have been remotely affordable until at least the early '90s. SACD didn't hit the market until 1999. Arguably Sony could have got something like SACD out the door earlier (circa 1995) if they'd used limited high-resolution (24/96) PCM and forgone multichannel support. But I don't know if that hypothetical format would have even attracted as much attention as SACD ultimately did; multichannel was actually a pretty strong selling point for both SACD and DVD-A.
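The capacity penalty for the hypothetical 24/48 disc is easy to check with raw-PCM arithmetic (my own numbers; real discs add error-correction overhead on top of this):

```python
# Raw audio bytes needed for 80 minutes of 16/44.1 stereo versus
# 60 minutes of 24/48 stereo (no error-correction overhead counted).

def raw_bytes(bits, rate_hz, channels, minutes):
    return bits * rate_hz * channels * minutes * 60 // 8

cd_80min    = raw_bytes(16, 44_100, 2, 80)   # 846,720,000 bytes
hires_60min = raw_bytes(24, 48_000, 2, 60)   # 1,036,800,000 bytes

# Even 20 minutes shorter, the 24/48 program needs ~22% more capacity.
print(cd_80min, hires_60min)
```

So even the shorter 24/48 disc from the original question would have needed a physically denser medium than the CD of 1982 could offer.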
Along those lines, I recall hearing from coworkers or a rep after the CES where Blu Ray debuted that they had to refrigerate the player at CES just to get it to operate*. Amazing how quickly things progress. *Since this is just hearsay, I make no claim to authenticity but it's certainly within the realm of possibility.
They didn't increase CD resolution because 1) almost everyone was more than happy with CD as it was, 2) there was near-zero demand, and 3) it would have been hugely expensive and disruptive. No demand + no return on investment = no change. In fact, apart from a niche hi-res market today, the trend has been towards lower resolutions.
Also recall that DVD-Video which Sony had a hand in developing is capable of stereo 96/24 uncompressed PCM. I have a few of these high resolution discs and they sound great. But FWIW I think 44.1/16 sounds completely awesome and adequate for extremely high fidelity as it is when mastered with care.
Yeah, I remain unconvinced that "high-res" is all that useful even today, especially when it's used to present awful, overcompressed, badly-equalized masters. DVD itself is capable of linear 24/96 PCM, as noted, but that also didn't roll out until 1995, and obviously the discs weren't backwards compatible with CD players.
I think it’s very easy to get carried away with thoughts about sample rates and bit depths when talking PCM. Whilst production workflow benefits from greater bit depth, more so than higher sample rates, consumption at 16/44.1 can be highly immersive and enjoyable. Especially when said work is done with taste and attention to detail.
Philips had developed the optical video disc in the late '70s. The problem Philips had was that, at least at the time, they used Constant Angular Velocity, or CAV: like an LP playing, the disc rotated at a constant speed. Sony developed and added Constant Linear Velocity, or CLV. CLV increased the amount of data on the disc: instead of rotating at a constant speed, the disc spins faster while reading the inner part and slower towards the outer edge, so the pickup passes over the track at a constant linear speed. The difference is that a CAV disc needed different pit lengths at different radii so that the data came off in constant time, while CLV keeps the pit geometry constant and varies the rotation speed instead.
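A rough illustration of what CLV implies for rotation speed (illustrative numbers of my own: a CD-style 1.3 m/s linear velocity and a program area from about 25 mm to 58 mm radius):

```python
import math

# RPM needed to hold a constant linear velocity v at track radius r.
def clv_rpm(v_m_s: float, radius_m: float) -> float:
    return v_m_s / (2 * math.pi * radius_m) * 60

v = 1.3                      # assumed linear velocity, m/s
inner = clv_rpm(v, 0.025)    # ~497 rpm at the innermost track
outer = clv_rpm(v, 0.058)    # ~214 rpm at the outer edge

print(round(inner), round(outer))
```

So a CLV player has to slow the spindle by more than half as the pickup tracks outward, which is exactly the mechanism complexity CAV avoided.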
I agree. It might have killed off the CD as well. If, only a few years after they introduced expensive players and discs (remember how expensive CDs were back in the beginning?), they had said, "Forget that, here is something better," people probably would have thought they'd better wait and see what was going to happen before they started buying. Back when SACDs came out, people had just finished converting their album collections to CDs and thought that getting people to buy new players and discs was just a money grab. Chad
Philips wanted 14-bit, but Sony insisted on 16-bit for CD. (The disc also carries extra subcode bits alongside the audio samples, which were later used for things like CD-Text artist and song titles.) It all became a standard known as the Red Book spec.
It's amazing to me they were able to get CD working when they did. It really pushed the limits of available technology. Some people like to slam CDs, but IMO we were very lucky to get it. It blew the market for recorded music wide open.