Originally Posted by Geoff D
...the DV enhancement layer can and often does include physical picture information that is not part of the HDR10 base layer, and the two are COMBINED during playback, which is what makes DV so processor-intensive. It also carries the dynamic metadata, of course. And Dolby's ICtCp processing space is much better suited to HDR, whereas YCbCr can run into problems at higher brightness as it was not designed for the PQ EOTF.
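For anyone curious what "ICtCp processing space" actually means: per Rec. ITU-R BT.2100, linear BT.2020 RGB is mixed into cone-like LMS signals, each channel is PQ-encoded, and a second matrix splits the result into intensity (I) and two chroma axes (Ct, Cp), which keeps brightness and colour better separated at high nit levels than YCbCr does. A minimal sketch of that pipeline (matrices as published in BT.2100; illustrative, not a production-grade conversion):

```python
import numpy as np

# PQ inverse EOTF (SMPTE ST 2084): linear light (1.0 = 10,000 nits) -> signal
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(y):
    yp = np.power(np.asarray(y, dtype=float), M1)
    return np.power((C1 + C2 * yp) / (1 + C3 * yp), M2)

# BT.2100: linear BT.2020 RGB -> LMS, then PQ-encoded LMS -> ICtCp
RGB_TO_LMS = np.array([[1688, 2146,  262],
                       [ 683, 2951,  462],
                       [  99,  309, 3688]]) / 4096
LMS_TO_ICTCP = np.array([[ 2048,   2048,    0],
                         [ 6610, -13613, 7003],
                         [17933, -17390, -543]]) / 4096

def rgb_to_ictcp(rgb_linear):
    """rgb_linear: BT.2020 linear RGB, normalised so 1.0 = 10,000 nits."""
    lms = RGB_TO_LMS @ np.asarray(rgb_linear, dtype=float)
    return LMS_TO_ICTCP @ pq_encode(lms)

# An achromatic pixel lands purely on the intensity axis: Ct = Cp = 0
i, ct, cp = rgb_to_ictcp([0.1, 0.1, 0.1])  # 1000-nit grey
```

A quick sanity check built into the maths: for grey (R = G = B) the LMS matrix rows each sum to 1, so the two chroma outputs cancel to zero and all the signal sits in I.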
Originally Posted by Geoff D
Heh...the metadata for ST reads as 4000/0.005 max/min MDL (mastering display), 3995 nit MaxCLL (so the absolute brightest point in the film) and 2345 MaxFALL (brightest average scene level). In other words chief, it's way beyond 1000 nits and there are plenty more discs out there from the likes of Sony and Warners that exceed that figure.
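For anyone unfamiliar with those metadata fields: MaxCLL and MaxFALL (defined in CTA-861.3) are both computed from the per-pixel maximum of R, G and B in nits; MaxCLL is the brightest such pixel anywhere in the programme, MaxFALL the highest frame average. A rough sketch of the computation (the toy frame below is made up for illustration):

```python
import numpy as np

def content_light_levels(frames):
    """Compute (MaxCLL, MaxFALL) from an iterable of HxWx3 arrays of
    linear light in nits, per the CTA-861.3 definitions:
      MaxCLL  = brightest max(R,G,B) of any pixel in any frame
      MaxFALL = highest frame average of per-pixel max(R,G,B)
    """
    maxcll = maxfall = 0.0
    for frame in frames:
        per_pixel = np.asarray(frame, dtype=float).max(axis=-1)  # max(R,G,B)
        maxcll = max(maxcll, float(per_pixel.max()))
        maxfall = max(maxfall, float(per_pixel.mean()))
    return maxcll, maxfall

# Toy example: a dim 2x2 frame with one 3995-nit specular highlight
frame = np.full((2, 2, 3), 100.0)
frame[0, 0] = 3995.0
cll, fall = content_light_levels([frame])  # cll = 3995.0, fall = 1073.75
```

This is why a single glint of sunlight on chrome can push MaxCLL to near the mastering peak while MaxFALL stays far lower.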
1000 nits sits at roughly 75% of the signal range of the Perceptual Quantiser EOTF, whose curve is defined up to 10,000 nits (and some at Dolby wanted it to go even higher, to 20K nits), and 1000 really should've been the absolute minimum for what HDR TVs could reach. Anything higher than that is often going to be restricted to the brightest of bright highlights for just that extra touch of realism on top; it mightn't sound like much, and I know you love your TV and don't see the need for much more - nobody does when they're wowed by what they've got - but when you see the step-up and take time to get used to it, going back to the former version is hard.
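That ~75% figure falls straight out of the PQ curve: push 1000 nits through the SMPTE ST 2084 inverse EOTF and you get a signal level of about 0.75 of full scale, leaving the top quarter of the range for everything from 1000 up to 10,000 nits. A quick check using the published ST 2084 constants:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: nits -> normalised signal (0..1)
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_signal(nits):
    y = nits / 10000.0            # PQ is anchored to a 10,000-nit peak
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# 1000 nits already uses ~75% of the signal range; the remaining
# ~25% of code values covers the whole 1000-10,000 nit stretch.
print(round(pq_signal(1000), 3))   # ~0.752
print(round(pq_signal(10000), 3))  # 1.0
```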
I keep banging on about how 'realistic' and 'natural' this kind of brightness looks on my TV because it does: we see nit levels in real life that greatly exceed these figures just by stepping outside the front door, so I'm all for getting moar briteness in the years to come (bearing in mind that Dolby masters stuff on a 4000-nit display). As I said above, there are more discs out there with 1000+ nit highlights than people might think, so as and when 2000+ nit brightness becomes more common, those discs will still have more to give.
Originally Posted by Geoff D
No-one said stopping at 1000 nits was going to make you or anyone super sad but the extra range is there for a reason. Light in the real world (heh) doesn't stop at 1000 nits and nor does the HDR visual system as designed by Dolby. If you see it as unnecessary for your own personal tastes then that's fine but also keep in mind that with a larger range - i.e. not just how many nits your TV can map to but what it can actually produce - comes an even wider colour volume, so it's possible that there are colour nuances on these discs that are yet to be uncovered. I know losing out on that won't make you 'super sad' as you love HDR already, as do I, but when pro calibrator Vincent Teoh got a look at Sony's 10,000-nit 8K prototype he remarked at how incredible the HDR looked on it, how extraordinarily life-like it was, and it was way better than any consumer LCD or OLED that he'd seen. He's seen a few!
I don't think we'll actually get 10K nit consumer televisions any time soon as the power demands are just too great, given the various environmental guidelines drawn up by the authorities, but 4K nits might well be within reach and if we had a set that did that without compromise then that also brings up another benefit: no need for tone mapping on the vast majority of UHD content (the MaxCLL on a handful of discs actually peaks at well over 4000 nits). Small gains again then, given how well most premium sets will actually tone map at the moment, but as with a lot of this HDR stuff it's the small gains that add up into a larger whole and an impact that will be better appreciated over time.
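To illustrate why a 4000-nit panel would rarely need to tone map: a display only has to compress highlights when the mastered content exceeds its own peak; otherwise it can pass the grade through untouched. A toy roll-off curve makes the point (this is a simplified stand-in for the BT.2390-style EETF a real TV would use, and the knee fraction is an arbitrary choice):

```python
import math

def tone_map(nits, display_peak, content_peak, knee_fraction=0.5):
    """Toy highlight roll-off, illustrative only. If the display covers
    the full mastered range, no mapping is needed; otherwise pass values
    through up to a knee, then compress the rest asymptotically toward
    the display's peak. Real TVs use curves like the BT.2390 EETF."""
    if content_peak <= display_peak:
        return nits  # display can show the grade as mastered
    knee = knee_fraction * display_peak
    if nits <= knee:
        return nits
    headroom = display_peak - knee
    return knee + headroom * (1 - math.exp(-(nits - knee) / headroom))

# A 1000-nit set has to squash a 3995-nit highlight up against its peak...
print(round(tone_map(3995, display_peak=1000, content_peak=4000)))
# ...while a 4000-nit set shows it essentially as mastered.
print(round(tone_map(3995, display_peak=4000, content_peak=4000)))  # 3995
```

The trade-off in any such curve is that the compressed region loses highlight separation, which is exactly the detail a higher-peak panel gets to keep.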