Why is there so much grain on some of my older movies on UHD?
And yes, HDR as an EOTF can exaggerate grain, but let's not forget that most of these proper 4K transfers, like Ghostbusters (GB), are going back to the camera negatives, thereby unveiling the grain in a way that was arguably never meant to be seen in the first place. Filmmakers knew full well that - negative or "show" prints and 70mm blowups aside - their images would be refracted through a copy of a copy of a copy. Duplication wasn't a lossless process in the analogue domain: every step shaved off high-frequency detail, and the grain went with it, turning into something softer yet more coarsely defined at the same time.
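Just to put a number on that "copy of a copy" softening: here's a toy numpy sketch (not a model of any real film stock or printer; the box blurs and the four-generation chain are crude stand-ins) where each print generation acts as a low-pass filter and the fine, grain-sized detail measurably drains away:

```python
import numpy as np

rng = np.random.default_rng(0)
grain = rng.normal(0.0, 1.0, 4096)  # toy 1-D "grain field": plain white noise

def print_generation(signal, kernel_width=9):
    """One duplication step modelled as a moving-average blur. Real optical
    printing isn't a box filter - this only captures its low-pass character."""
    kernel = np.ones(kernel_width) / kernel_width
    return np.convolve(signal, kernel, mode="same")

def high_freq_energy(signal):
    """Energy left over after subtracting a heavily smoothed copy."""
    low = np.convolve(signal, np.ones(65) / 65, mode="same")
    return np.std(signal - low)

copy = grain
for gen in range(4):  # roughly: o-neg -> interpositive -> internegative -> print
    print(f"generation {gen}: high-frequency energy = {high_freq_energy(copy):.3f}")
    copy = print_generation(copy)
```

Each pass through the "printer" knocks the high-frequency figure down, which is exactly why grain off the o-neg looks so much finer and busier than grain off a release print.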
With these o-neg transfers we're getting the grain resolved in a much finer but more densely concentrated form, hence the swarm-o-vision that people are experiencing, including myself to some degree before I finally nailed down my settings.
Anyhoo, after seeing this thread get resurrected I popped GB in last night to flick through a few scenes. My God, it's beautiful. Grainy as balls, to be sure - it won't be to everyone's taste and can be pushed into unwatchable hideousness depending on settings, displays and whatnot - but there's so much detail and filmic texture there that it's like watching a pristine negative print, only with the added kick of HDR. Wow wow wow.
Why do some UHDs look dark?
BUT this is a title that hasn't been mastered anywhere near as bright as conventional HDR, so the more extreme the tone mapping, the dimmer and darker it will turn out. The irony is that it barely hits 250 nits peak in the whole presentation, so you'd think even a mere 300-nit HDR TV would be able to handle it 'as is', right? Er... no. Because it's been put into a 4000-nit container (as per usual for Warners), that container is all the mapping algorithm is seeing, so instead of clipping the presentation to fit said 300-nit TV it's trying to squeeze what it thinks is 4000 nits' worth down into it, and the APL (average picture level) takes a hell of a nosedive as a result.
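Here's a crude sketch of why that happens. The compressor below isn't any TV's actual curve - it just squashes the signal in log-luminance space until the assumed peak fits the display's 300 nits - but it shows how trusting the 4000-nit container instead of the ~250-nit content tanks the APL of a dim scene:

```python
import math

def squeeze(nits, assumed_peak, display_peak):
    """Crude full-range tone map: compress in log-luminance space so the
    assumed source peak lands on the display peak. A stand-in for whatever
    perceptual-domain curve a real TV uses, not any actual algorithm."""
    ratio = min(1.0, math.log(display_peak) / math.log(assumed_peak))
    return math.exp(math.log(nits) * ratio)  # never expands (ratio <= 1)

# A dim scene: shadows and midtones, true content peak around 250 nits.
scene = [5.0, 20.0, 60.0, 90.0, 250.0]

for assumed_peak in (250.0, 4000.0):  # true content peak vs. the container
    mapped = [squeeze(n, assumed_peak, display_peak=300.0) for n in scene]
    apl = sum(mapped) / len(mapped)
    print(f"mapper assumes a {assumed_peak:.0f}-nit source -> APL {apl:.1f} nits")
# ~85 nits when it trusts the content, under 20 when it trusts the container.
```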
Is it possible? Why do you think people have been blathering on about dynamic metadata for the past 18 months? PQ does indeed code to absolute values, but this is the ENTIRE crux of the issue between static display-derived metadata and dynamic content-derived metadata: all the static mastering data does is tell you the maximum, average and minimum levels of the content as a whole, NOT what those absolute values should be rendered at on a scene-by-scene basis.
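For the curious, "codes to absolute values" means exactly this: the SMPTE ST 2084 (PQ) EOTF turns a code value straight into a number of nits, the same number on any display that can reach it. A minimal sketch with the constants from the spec (full-range 10-bit signal assumed for simplicity; real video is typically narrow range):

```python
# SMPTE ST 2084 (PQ) EOTF: code value -> absolute luminance in nits.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code, bit_depth=10):
    """Map a PQ code value to cd/m^2. The same code always means the same
    absolute luminance - that's the whole point of PQ."""
    e = code / (2 ** bit_depth - 1)   # normalised signal, 0..1
    ep = e ** (1 / M2)
    y = (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1 / M1)
    return 10000.0 * y                # PQ tops out at 10,000 nits

for code in (0, 512, 767, 1023):
    print(f"code {code:4d} -> {pq_eotf(code):8.1f} nits")
# 0 -> 0, 512 -> ~93, 767 -> ~980, 1023 -> 10000
```

So the code values themselves are unambiguous; what's missing in static HDR10 is any instruction for what to do with them when the display can't reach the mastered levels.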
Most TVs are taking that static container at face value and applying a tone map that takes none of the content into account, and the tone mapping approach itself varies from manufacturer to manufacturer. Some do indeed prefer to clip to keep the APL high, some will apply a full tone map to squeeze the range down into whatever the display can do (apparently Dolby's own HDR10 mapper does this), which will KILL the APL on a UHD like Goodfellas, and some use a combination of both, mapping out to a point close to what the TV can handle and clipping the rest. And the lower down the ladder you go in terms of the display and the range it can handle, the more severe the mapping has to be; conversely, the more range you've got, the more the mapping can be eased off.
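Toy versions of those three strategies (none of them any manufacturer's actual implementation; the curve shapes and the 0.65 knee are made up) applied to the same dim frame in a 4000-nit container on a 300-nit display:

```python
import numpy as np

def clip_map(nits, display_peak):
    """Clip: track the content 1:1, discard everything above the display's
    peak. Keeps the APL high, throws away highlight detail."""
    return np.minimum(nits, display_peak)

def full_map(nits, container_peak, display_peak):
    """Full map: compress the whole assumed range into the display (here,
    crudely, in log space). Preserves highlight gradation but can kill the
    APL of a dim title sitting in a bright container."""
    ratio = np.log(display_peak) / np.log(container_peak)
    return np.exp(np.log(np.maximum(nits, 1e-4)) * ratio)

def hybrid_map(nits, container_peak, display_peak, knee=0.65):
    """Hybrid: 1:1 up to a knee, roll off towards the peak, clip the rest."""
    knee_nits = knee * display_peak
    slope = (display_peak - knee_nits) / (container_peak - knee_nits)
    rolled = knee_nits + (nits - knee_nits) * slope
    return np.where(nits <= knee_nits, nits, np.minimum(rolled, display_peak))

frame = np.array([5.0, 20.0, 60.0, 90.0, 250.0])  # dim, Goodfellas-ish frame
container, display = 4000.0, 300.0

for name, mapped in [
    ("clip",   clip_map(frame, display)),
    ("full",   full_map(frame, container, display)),
    ("hybrid", hybrid_map(frame, container, display)),
]:
    print(f"{name:6s} -> APL {mapped.mean():5.1f} nits")
# clip keeps APL at ~85, hybrid lands near ~74, the full map craters to ~19
```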
Some TVs are rocking to their own beat too: recent LG OLEDs have a faux-dynamic mode which processes the content to create its own scene-by-scene interpretation, and Sony do something similar - on their higher-end sets they ignore the static mastering data and clip all content to whatever level your settings are at whilst generating dynamic brightness levels. Basically, with static HDR10 we're all still at the mercy of our gear and how it thinks it should map the content for us, whereas with something like Dolby Vision the display is told how best to map the content.
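The scene-by-scene idea itself is simple, even if Dolby's actual mapper and metadata are proprietary and the sketch below is just the general principle with made-up numbers: drive the compressor from each scene's own peak rather than from the container, and dim scenes come through untouched while bright ones still get squeezed.

```python
import math

def squeeze(nits, assumed_peak, display_peak=300.0):
    """Same crude log-space compressor as above: fit the assumed peak into
    the display's peak, never expanding."""
    ratio = min(1.0, math.log(display_peak) / math.log(assumed_peak))
    return math.exp(math.log(nits) * ratio)

# Two scenes living in the same 4000-nit container.
scenes = {
    "dim interior":    [4.0, 15.0, 50.0, 180.0],      # scene peak 180 nits
    "bright exterior": [30.0, 120.0, 600.0, 3000.0],  # scene peak 3000 nits
}

for name, scene in scenes.items():
    static  = [squeeze(n, 4000.0) for n in scene]      # container-driven
    dynamic = [squeeze(n, max(scene)) for n in scene]  # scene-peak-driven
    print(f"{name}: static APL {sum(static)/len(static):5.1f} nits, "
          f"dynamic APL {sum(dynamic)/len(dynamic):5.1f} nits")
# The dim interior goes from ~15 nits (static) back up to ~62 (dynamic).
```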