adding "grain" back into Hi Def Video

Discussion in 'Visual Arts' started by b&w, Jul 11, 2006.

Thread Status:
Not open for further replies.
  1. b&w

    b&w Forum Resident Thread Starter

    this is an interesting bit of information-

    http://www.videobusiness.com/article/CA6350773.html

    SMPTE approves Thomson technology
    Film Grain process part of HD DVD specs but not Blu-ray
    By Paul Sweeting 7/7/2006
    JULY 7 | Technicolor-parent Thomson has been given a stamp of approval for a technology to make high-definition video look more like conventional film during playback.

    The Society of Motion Picture and Television Engineers last week formally adopted the technology by publishing the system’s specs on its Web site as a “registered disclosure document,” the engineering equivalent of the Good Housekeeping Seal of Approval.

    The process, called Film Grain Technology, is already in use in HD DVD players and playback software. But the endorsement by SMPTE paves the way for broader adoption as high-def video becomes more entrenched both in the movie production process and home entertainment formats.

    “Thanks to our relationship with Technicolor, we have access to a lot of people in Hollywood, including many of the top ‘golden eyes’ at the studios,” the head of Thomson’s Corporate Research Center, Jeff Cooper, said. “We were very happy to be able to fool several of the top people in the field, who couldn’t tell the difference between original film and the video using our technique.”

    Typically, the “grain” that naturally occurs in film emulsions is removed in transferring a film to video, giving video a somewhat “flat” or artificial look compared with film.

    Thomson’s Film Grain Technology inserts a simulated grain pattern into a video signal before display to restore some of the richness and texture lost during the transfer.

    “With standard-definition and ordinary TV displays, you can’t really resolve grain, so there was no point in trying to keep it,” Cooper said. “But with HDTV sets, you have a device that can preserve and display the original grain if you can get it there. Although it’s still a problem to get the original grain to the display, we can preserve the look of the original so what you’re seeing looks more like a projected film image.”

    Unlike actors and other solid objects in a film scene, the grain pattern in film is random from one frame to another. That makes it difficult to encode using standard compression schemes, which rely on predicting the content of one frame as much as possible from the frame before.

    Without such predictability, the grain pattern for each frame would have to be encoded separately, which would consume much of the bandwidth or storage capacity available with current video formats, even given the increased capacity of HD DVD and Blu-ray Disc.

    As a result, grain is usually removed to leave bits available for image detail.
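
    (A minimal sketch of the prediction problem the article describes, not anything from Thomson's encoder: two consecutive frames share the same underlying image but carry independent grain, so the frame-to-frame residual the encoder must spend bits on is dominated by noise. The frame size and grain level below are made up for the example.)

    ```python
    # Why random grain is expensive for inter-frame prediction (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)

    base = rng.uniform(0.2, 0.8, size=(1080, 1920))     # stand-in for a static scene
    grain_a = rng.normal(0.0, 0.02, size=base.shape)    # grain in frame N
    grain_b = rng.normal(0.0, 0.02, size=base.shape)    # independent grain in frame N+1

    clean_residual = base - base                          # perfect prediction: nothing left to code
    grainy_residual = (base + grain_b) - (base + grain_a)

    print("residual energy, clean frames :", np.mean(clean_residual ** 2))
    print("residual energy, grainy frames:", np.mean(grainy_residual ** 2))  # ~2 * 0.02**2
    ```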

    Although grain is still removed in the Thomson process, FGT developers used mathematical models of particular film grains to create simulated grain patterns.

    The simulated pattern is then pumped back into the video just before it’s displayed.

    “People have been modeling film grain and developing simulations for years, so there was a large body of knowledge to draw on there,” Cooper said. “The key was to come up with ways of modeling that could accommodate all, or at least many, of the variety of film stocks that directors use to achieve the effect they want, and then make it simple enough to implement in consumer devices.”
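
    (A toy sketch of the general idea of re-synthesizing grain at display time; this is not the SMPTE-registered FGT method. Filtered Gaussian noise stands in for a grain model, and the `strength` and `grain_size` parameters are invented for the example.)

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def add_simulated_grain(frame, strength=0.03, grain_size=1.5, seed=None):
        """frame: float array in [0, 1]; returns a copy with synthetic grain blended in."""
        rng = np.random.default_rng(seed)
        noise = rng.normal(0.0, 1.0, size=frame.shape)
        grain = gaussian_filter(noise, sigma=grain_size)   # correlate neighbours -> apparent grain size
        grain /= grain.std() + 1e-8                        # renormalise after filtering
        midtone_weight = 4.0 * frame * (1.0 - frame)       # grain reads strongest in mid-tones
        return np.clip(frame + strength * midtone_weight * grain, 0.0, 1.0)

    decoded = np.random.default_rng(1).uniform(0.0, 1.0, size=(1080, 1920))
    displayed = add_simulated_grain(decoded, seed=42)      # applied just before display
    ```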

    With SMPTE approval, the FGT is also likely now to find a role in the production process, Cooper said. “A lot of movies today are a combination of traditional film and CGI, which does not contain grain. This is a convenient way to give the entire film a consistent look.”

    In early 2005, Thomson proposed the technology to the DVD Forum, which adopted it as a mandatory feature of the HD DVD specifications.

    The technology also was offered to the Blu-ray Disc Assn. but was not adopted for that format.

    Cooper declined to comment on the reasons behind the Blu-ray group’s decision.
     
  2. Scott Wheeler

    Scott Wheeler Forum Resident

    Location:
    ---------------
    dither for HDTV
     
  3. JoelDF

    JoelDF Senior Member

    Location:
    Prairieville, LA
    Next, they'll offer a "hiss generator" for noise reduced hi-res audio formats :)
     
  4. Ken_McAlinden

    Ken_McAlinden MichiGort Staff

    Location:
    Livonia, MI
    ...and in the meantime, film companies will continue to develop super-fine grain structure emulsions to make visible film grain undetectable.

    In any case, it's just one more digital tool in the box for cinematographers trying to create a specific look.

    Regards,
     
  5. Scott Wheeler

    Scott Wheeler Forum Resident

    Location:
    ---------------
    They need all the help they can get. Most directors of photography have had these formats forced upon them for economic reasons. It sucks.
     
  6. Flatlander

    Flatlander Forum Resident

    Location:
    Indy
    Wait a minute!

    Isn't this like adding a "Rice Krispies" track or layer for digital audio playback?
     
  7. BrianH

    BrianH Formerly healyb

    Location:
    usa
    Exactly what I was thinking!
    Someone tell them NO!
     
  8. mandel

    mandel New Member

    Location:
    London, UK
    WTF?!

    The HDTV samples I've viewed on my computer look amazing BECAUSE there is no film grain; there is no veil between the light going into the camera and what is displayed on the screen. Why on earth would anybody want to make HDTV look like crappy cinema film?! Let's make the image jump every 20 minutes too so it looks like we have reel changes :sigh:

    Is people's resistance to anything new really that extreme?!
     
  9. Scott Wheeler

    Scott Wheeler Forum Resident

    Location:
    ---------------
    No, it's not. Film grain is invisible with today's films.
     
  10. Scott Wheeler

    Scott Wheeler Forum Resident

    Location:
    ---------------

    On your computer? What is your screen size? How does it compare to a large movie screen? Film is vastly superior to HDTV.


    Yes, when it sucks as badly as HDTV does compared to film.
     
  11. Dan C

    Dan C Forum Fotographer

    Location:
    The West
    IMHO 'sucks' is too strong a word. More like 'different'. Digital HiDef has come a long way and has a way to go, but it can compete visually with film. Of course they'll never look the same, nor should they. Personally I think it's ludicrous to degrade a clean HD image. If there's grain in the original print, LEAVE IT. If there's no grain, then so be it.

    dan c
     
  12. Flatlander

    Flatlander Forum Resident

    Location:
    Indy
    ... and all wire sounds the same, too. ;)

    Sorry for the big jump, but that's what your statement reminds me of. I'm drawing silly parallels to satirize your absolutism. I haven't been to a theater in 2 years, but I could see plenty of film grain then, and I always did in certain areas. Graininess in filming depends on many factors. It's also more visible to those with trained eyes.

    Digi is definitely different looking ... not arguing that point. Adding a film-like grain to the other digital signatures is kind of like some preferring tubes and their "elegantly poor" specs. I don't really see a problem with the idea, although it's not a purist approach. Tell us more.
     
  13. Scott Wheeler

    Scott Wheeler Forum Resident

    Location:
    ---------------

    Not too strong a word for me. I think I am being nice. HDTV is just plain ugly.



    I agree it has a long way to go but it can only go so far. The format sets the maximum resolution way below that of film.




    I degrade it because it deserves it. What should I do, look at it, hate it, and shut up about it? HDTV has no "grain" but it certainly has less resolution. Digital artifacts and video artifacts are, to my eyes, far more offensive than film artifacts. That is, if you can actually see the film artifacts. The dynamic range of film is excellent, and its ability to capture fine gradations in contrast is simply unequaled.
     
  14. Scott Wheeler

    Scott Wheeler Forum Resident

    Location:
    ---------------

    It's not absolutism. There comes a point where grain becomes invisible. Today's best film stocks are there.





    Not to get into a pissing match, but my eyes are well trained.



    That's not the point of adding grain at all. Like I said, it is like dither. It will help remedy certain intrinsic problems with HDTV.



    It's not a matter of purist or non-purist approaches. It is about fixing inherent problems with the format.
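
    (A rough sketch of the dither analogy being made here: quantizing a smooth ramp to a few levels produces hard, visible bands, while adding a little random noise before quantization turns that correlated banding into fine, grain-like noise. The level count and noise amplitude below are arbitrary.)

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    gradient = np.linspace(0.0, 1.0, 1920)   # a smooth horizontal ramp
    levels = 32                              # deliberately coarse quantizer to exaggerate banding

    banded = np.round(gradient * (levels - 1)) / (levels - 1)
    dither = rng.uniform(-0.5, 0.5, size=gradient.shape) / (levels - 1)
    dithered = np.round((gradient + dither) * (levels - 1)) / (levels - 1)

    def longest_flat_run(x):
        """Length of the longest run of consecutive identical values."""
        best = current = 1
        for a, b in zip(x[:-1], x[1:]):
            current = current + 1 if a == b else 1
            best = max(best, current)
        return best

    print("longest flat run, banded  :", longest_flat_run(banded))    # ~60 samples: a visible band
    print("longest flat run, dithered:", longest_flat_run(dithered))  # short runs: noise, not contours
    ```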
     
  15. b&w

    b&w Forum Resident Thread Starter

    The article isn't talking about intrinsic problems with HDTV, whether today's best film stocks have grain, how crappy you think HDTV looks, etc. If you want to talk about the article, then address what the article is talking about. If you have some other issue with HDTV, then start another thread.

    So what the article says is-
    Typically, the “grain” that naturally occurs in film emulsions is removed in transferring a film to video, giving video a somewhat “flat” or artificial look compared with film

    And why do they remove the grain-
    As a result, grain is usually removed to leave bits available for image detail.

    How are they going to correct the fact that they remove grain during the transfer-
    Although grain is still removed in the Thomson process, FGT developers used mathematical models of particular film grains to create simulated grain patterns.
    The simulated pattern is then pumped back into the video just before it’s displayed.

    Why is this important to do now with HDTV-
    “With standard-definition and ordinary TV displays, you can’t really resolve grain, so there was no point in trying to keep it,” Cooper said. “But with HDTV sets, you have a device that can preserve and display the original grain if you can get it there. Although it’s still a problem to get the original grain to the display, we can preserve the look of the original so what you’re seeing looks more like a projected film image.”

    Anyone with more than a passing interest in the cinematography of movies knows that there are often different levels of grain in a film print, intended to be there for whatever reason by the director and/or the DP. So how will this system add different levels of "grain"? How will it match the grain to what was originally intended? And many other questions along that train of thought. I'd like to see some clips from the exact same transfer with the system activated and turned off.
     
  16. BrianH

    BrianH Formerly healyb

    Location:
    usa
    Sounds like a BAD idea.
     
  17. Flatlander

    Flatlander Forum Resident

    Location:
    Indy
    My first post was a legitimate question, although worded a little comically. ;)

    Even after Scott's detailed assistance, I still don't quite understand how it will help to nix the grain to save some bandwidth and replace it with a simulated digigrain artifact.

    Maybe you can offer some additional insight for those of us not in the loop.
     
  18. Dragun

    Dragun Forum Resident

    Location:
    Los Angeles, CA
    My guess is we'll get digital noise reduction artifacts (smearing, "holes" in the picture) combined with this new grain on at least a handful of releases.
     
  19. Scott Wheeler

    Scott Wheeler Forum Resident

    Location:
    ---------------


    Certainly it is. The inability to resolve the grain is an intrinsic problem. If it could resolve the grain there would be no reason to try to compensate for it by adding a simulated grain back into it.





    I am. HDTV's inability to reproduce film, and a new technology that helps compensate for that inability.





    No, my only issue with it is how it looks. This article is about a means of improving it.





    IOW, there simply isn't enough resolution, and the end result is a simplified and stark-looking image that does a poor job of rendering gradations in value.






    So what exactly is your beef? What you just quoted is pretty much exactly what I am talking about.



    You are missing the point. The point isn't to see the grain. The point is the randomness of the grain makes the image much better. That same randomness makes it very difficult to digitize without running out of memory. That lack of memory and the methods used to compensate for it cause specific ugly artifacts in HDTV. Trying to add back the grain is a means of fixing those ugly artifacts. It's not so the image can look grainy. As you say "anyone with more than a passing interest in cinematography" would be aware of these shortcomings and why adding grain would help remedy them.



     
  20. Scott Wheeler

    Scott Wheeler Forum Resident

    Location:
    ---------------

    The effect of grain, as opposed to pixels or video lines, is improved perceived resolution, because the grain is random. The randomness of that grain prevents you from locking into it visually; it's all moving around at 24 frames per second. With HDTV the pixels are stationary, and there are methods of saving memory by treating pixels that are close enough to the same from frame to frame as if they were exactly the same. That leads to ugly artifacts. The reason you can't record the film grain is that it defeats these memory-saving methods and soaks up all the memory. The fake grain added back is encoded so as to take up far less memory than actual film grain.
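
    (A back-of-the-envelope sketch of that last point: real grain has to be carried per pixel, per frame, while parametric grain only needs a few numbers per scene that the player uses to re-synthesize it. The figures are illustrative, not from the FGT spec.)

    ```python
    frames_per_scene = 120                 # about 5 seconds at 24 fps
    pixels_per_frame = 1920 * 1080

    # Carrying the real grain: roughly one extra value per pixel, per frame.
    per_pixel_bytes = frames_per_scene * pixels_per_frame * 1

    # Parametric grain: a grain-size value plus a strength value for a handful of
    # brightness bands, sent once per scene; the player re-synthesizes the rest.
    parametric_bytes = (1 + 8) * 4

    print(f"per-pixel grain : {per_pixel_bytes / 1e6:.0f} MB per scene")
    print(f"parametric grain: {parametric_bytes} bytes per scene")
    ```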
     
  21. Grant

    Grant Life is a rock, but the radio rolled me!

    Uh...I have actually added light white noise and tape hiss to NR'ed files or CDs to help bring back some naturalness to the music. It has fooled several people.
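
    (A rough sketch of the kind of thing described here: mixing a very low level of broadband, slightly treble-tilted noise back into an over-denoised signal. The level and tilt are guesses for illustration, not anyone's actual settings.)

    ```python
    import numpy as np

    def add_hiss(audio, level_db=-66.0, seed=None):
        """audio: mono float array in [-1, 1]; returns a copy with faint hiss mixed in."""
        rng = np.random.default_rng(seed)
        hiss = rng.normal(0.0, 1.0, size=audio.shape)
        tilt = np.diff(hiss, prepend=hiss[:1])         # first difference boosts the highs
        hiss = 0.5 * hiss + 0.5 * tilt                 # gentle "tape hiss" spectral tilt
        hiss /= np.max(np.abs(hiss)) + 1e-12
        return np.clip(audio + hiss * 10 ** (level_db / 20.0), -1.0, 1.0)

    silence = np.zeros(44100)                          # one second of digital silence
    restored = add_hiss(silence, seed=7)
    ```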
     
  22. JoelDF

    JoelDF Senior Member

    Location:
    Prairieville, LA
    Somehow I'm not surprised that you tried it :)

    Although, I know that the analogy I made there is only somewhat comparable. But there was always the running argument about removing grain from the classic Disney animated films "because it wasn't in the cel paintings they made." Sound familiar?

    I know that Scott's replied already again, but the bandwidth they are talking about is at the compression stage. The "noisier" the image is, the harder it is to compress without things getting blotchy. You can increase the bitrate to do less compression on it, but that creates a bigger file and uses more bandwidth to process such an image. So the trick is to smooth (or flatten) out the image first by running it through some filters, then run it through the digital image compression algorithms, which now have an easier time and can make smoother color gradients with less file size and bandwidth. The artificial grain is added in after the image has already been "decompressed" (and bitrate and bandwidth issues already dealt with) for display on the screen.
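
    (A sketch of that pipeline order only; the denoise and encode/decode steps below are crude stand-ins, a blur and 8-bit rounding, not real codecs or the actual FGT tools.)

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def denoise(frame):                    # "flatten" the image so it compresses cleanly
        return gaussian_filter(frame, sigma=0.8)

    def encode_decode(frame):              # stand-in for the lossy HD codec on the disc
        return np.round(frame * 255.0) / 255.0

    def resynthesize_grain(frame, strength=0.02, seed=0):
        rng = np.random.default_rng(seed)
        return np.clip(frame + strength * rng.normal(size=frame.shape), 0.0, 1.0)

    source = np.random.default_rng(1).uniform(0.0, 1.0, size=(1080, 1920))
    on_disc = encode_decode(denoise(source))      # grain stripped before compression
    on_screen = resynthesize_grain(on_disc)       # grain put back only at playback
    ```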

    But, I agree with b&w. This can't be an intrinsic problem with the HDTV format in and of itself - otherwise the "after-the-fact" grain addition still wouldn't work - it still has to be displayed on an HDTV screen.

    It is a problem with the delivery format chosen to store HD content. A high-def source does not necessarily have to be compressed - thus, the grain issue does not really have to be an issue at all. However, content providers are all obsessively driven to have any kind of digital content stored on these really small discs, whose size was chosen some 25-odd years ago to store only 74 minutes of stereo audio in an uncompressed, medium-resolution digital format. All because consumers as a whole seem to be enamored with those shiny things. That's why everything has to be compressed in some form or another. It would be a great day when someone invents a method of storing HD content on a portable consumer medium in a completely uncompressed format.
     