8-bit is ugly. I've grown weary of cheap filmmakers using what I call "Toy Cameras" (Canon 7D, 5D, C300) for production, where the lack of bits is a huge problem with big exposure changes. More bits is a good thing.
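To put a rough number on why big exposure changes hurt at 8 bits, here's a back-of-the-envelope sketch (my own toy model, not anything a real camera does; it ignores gamma/log curves) counting how many distinct code values survive a 2-stop push in the shadows:

```python
# Toy model: how many distinct levels are left after boosting the
# darkest slice of the signal range by 2 stops (x4 gain), at 8-bit
# vs 10-bit quantization. Illustrative only -- real cameras apply
# gamma or log encoding on top of this.

def levels_after_push(bit_depth, stops=2, shadow_fraction=0.125):
    """Count distinct code values left after pushing the darkest
    shadow_fraction of the range by 2**stops."""
    max_code = 2 ** bit_depth - 1
    gain = 2 ** stops
    shadow_codes = range(int(max_code * shadow_fraction) + 1)
    pushed = {min(c * gain, max_code) for c in shadow_codes}
    return len(pushed)

print(levels_after_push(8))   # 8-bit: 32 distinct shadow levels
print(levels_after_push(10))  # 10-bit: 128 levels over the same range
```

Four times the levels to describe the same shadow detail is the difference between a push that holds up and one that banding tears apart.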
Used to have one of those Fisher Price "pixelvision" cameras that recorded about 3 minutes of low res black and white video onto an audio cassette.
I've seen many movies made for millions of dollars that are less creative and entertaining than this.
You can kick all the ass you want, but just don't expect great pictures when the images break in color correction; that is exactly the problem. Note that there are small, very light 10-bit cameras out there if you look for them. The crappy lenses and auto-gain/auto-iris on a lot of drone cameras don't help, either.
At least 8-bit digital video is far better than the old NTSC and PAL analog composite video systems. I still remember the first time I watched DVD PROPERLY plugged into a T.V. I avoided DVD demos at local Spanish retailers as I knew they were not gonna get it right, so my first time was at home on a 25'' Sony Trinitron set and a Pioneer DVD player which I multi-regioned myself.

European T.V. sets of the time didn't have component video inputs but had Euro SCARTs, which allowed RGB video. European DVD players did have Euro SCARTs with RGB output converted from the component video on the disc. The principle was the same for US and EU players: CRT T.V. sets in the end needed an RGB video signal; on US sets RGB was derived from component inside the T.V., on European sets RGB was derived inside the DVD player.

The first movies I watched at home were Terminator 2 (the first US release by Artisan), Contact and Outbreak, all of them Region 1 NTSC discs. I was amazed at how clean and stable it looked compared to NTSC LaserDisc, and at the lack of color artifacts: hanging dots, cross color, bleeding...

Now I see some banding on a Blu-ray disc and think what a terrible format it is, all the blame on 8-bit encoding. I have the Samsung UHD BD player and an SDR 4K T.V. set which strangely has a 10-bit video panel, and when I play a UHD BD disc and see some banding I think the same, only this time it's not the format to blame but my non-HDR T.V. set. I could change to a mid-range set now, but I'm waiting for Dolby Vision to become standard even on mid-level sets. Then I doubt I'll see any video banding.
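For anyone curious what "deriving RGB from component" actually involves, it's essentially one matrix multiply per pixel. Here's a minimal sketch using the standard BT.601 studio-range coefficients (the ones applicable to SD DVD video; the constants are the usual rounded approximations):

```python
# Sketch of the YCbCr -> RGB conversion a DVD player (or a US TV
# taking component in) performs. BT.601 studio-range: Y spans
# 16..235, Cb/Cr are centered on 128. Coefficients are the common
# rounded approximations.

def ycbcr_to_rgb(y, cb, cr):
    """Convert one 8-bit BT.601 studio-range YCbCr sample to RGB."""
    yv = 1.164 * (y - 16)
    r = yv + 1.596 * (cr - 128)
    g = yv - 0.392 * (cb - 128) - 0.813 * (cr - 128)
    b = yv + 2.017 * (cb - 128)
    clip = lambda v: max(0, min(255, round(v)))
    return clip(r), clip(g), clip(b)

print(ycbcr_to_rgb(235, 128, 128))  # reference white -> (255, 255, 255)
print(ycbcr_to_rgb(16, 128, 128))   # reference black -> (0, 0, 0)
```

Whether that math ran in the player (Europe, over SCART) or in the set (US, over component) was the only real difference; the signal on the disc was identical.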
I think there's good and bad out there. I hear you on the SCART connector: we lobbied very hard at Video Review magazine in the 1980s for the EIA to come up with some kind of combo audio/video connector, but it fell on deaf ears. Eventually, the entire industry swung towards HDMI, which makes a lot of sense for most consumer stuff. (I actually much prefer HD-SDI connectors because they're a lot heavier-duty and not subject to the general flakiness of HDMI.)

Back in the day, there was a lot of debate about coming out of LaserDisc players via S-VHS (Y/C) vs. composite, but the truth was that a lot of LaserDisc glass mastering -- at least at the two plants I visited here in LA -- was all composite. The differences boiled down to the processing in the player vs. the processing in the TV set, so it was kind of a crap shoot. With DVD, at least you were benefitting from digital component encoding, and there's a lot of good in that. True, it's still 8-bit, but the noise factor and the aliasing were tamped down quite a bit.

I do still cringe at seeing a lot of 8-bit artifacts on stuff I've worked on where I know full well we didn't see any stepping or banding, particularly in stuff like graduated skies or fades to or from black. There are a handful of true 10-bit displays out there, and one hopes that eventually we'll have 10-bit material to watch on them. More bits is good. I'm a big advocate of More Bits, even in comparison to More K.
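The graduated-sky case is easy to quantify. A gentle gradient only spans a fraction of the signal range, so it gets a correspondingly small share of the available code values. A toy count (linear quantization assumed; real video gamma shifts the exact numbers):

```python
# Why graduated skies band at 8 bits: a gradient covering only a
# slice of the signal range gets very few distinct code values.
# Linear quantization assumed; gamma changes the exact counts.

def gradient_steps(bit_depth, span=0.2, samples=10000):
    """Distinct code values across a gradient covering `span`
    (here 20%) of the full signal range."""
    max_code = 2 ** bit_depth - 1
    codes = {round(i / (samples - 1) * span * max_code)
             for i in range(samples)}
    return len(codes)

print(gradient_steps(8))   # 52 steps to draw the whole sky
print(gradient_steps(10))  # 206 steps at 10-bit
```

Fifty-odd steps across an entire sky is exactly where you see contouring that was never in the master.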
SCART was a pain in the you-know-where. They didn't lock well, they were big, clumsy pieces of plastic and metal; you swept the floor behind your furniture and the SCART came loose. I'm happy not to see them again. Regarding LaserDisc, the standard used composite video, didn't it? There was no standard for mastering a LaserDisc with Y/C on it. I remember some players had S-Video output only because they claimed their digital comb filters did a better job at separating chrominance and luminance than the built-in comb filter found on most T.V.s. It was a lousy system anyway; it looked better than VHS, but that's not saying much. I don't miss either of them.
SCART was a mess. The cabling could handle RGB, S-Video or composite signals, but as RGB and S-Video shared the same pins there were many compatibility issues. On a typical UK TV with multiple SCART inputs, the usual case was that only one would support RGB and composite, whilst the others only handled S-Video and composite. Getting all of your devices to output the best possible signal was often impossible without resorting to purchasing a third-party SCART switcher, and finding one that handled RGB properly wasn't easy. Some devices didn't even produce RGB out of the box: in the case of the PlayStation and PlayStation 2 you needed to purchase an additional cable if you wanted RGB, as the default cable was just three RCA jacks for composite video and audio, plus an RCA-to-SCART adapter. Then there was the case of some SCART cables not being fully wired, with only the pins required for audio and composite present, to confuse things further.
Yeah, whenever I brought this up to the EIA or to the electronics companies, they always shrugged and said, "SCART has been a huge mess in Europe and we don't wanna go through that again." I think HDMI has been a lot smoother, although the early ones had a lot of issues with momentary glitches and disconnects. I don't believe that expensive HDMI cables make better pictures, but I think well-made cables are worth a few extra bucks, like a $15 cable vs. a $5 cable. If you get a better connector, chances are it'll hold up better over time. BTW, note that all HD consumer video is still 8-bit, so these problems continue. The new HDR sets will include some 10-bit models, and that makes a world of difference.
What's funny is that many HDR sets still use 8-bit panels. I guess this negates one of the HDR features, which is Wide Color Gamut.