This excellent article was recently published (02/04/2018) by GEOFF, someone completely trustworthy, who discusses at length how "HDR" behaves on projectors of every technology.
Why you shouldn't expect great HDR from a projector
High dynamic range is the latest tech trend in the TV world, with its brighter highlights and lavish colors. It produces a more dramatic improvement in image quality than 4K resolution, especially on a high-end TV. But if you consider TVs too small and prefer the huge images produced by projectors, I have some bad news.
A bunch of new 4K projectors are touting HDR compatibility too, but they've got issues. Limited brightness, the impossibility of local dimming, low contrast ratios and, often, limited color reproduction mean that HDR on projectors won't look nearly as good as it does on TVs. It's often barely HDR at all, and on some projectors, for example the Optoma UHD60 and BenQ HT2550 recently reviewed by CNET, it actually looks worse than the standard dynamic range (SDR) TV shows and movies that have been available for years.
Here's why.
Projectors have limited dynamic range:
Since HDR stands for high dynamic range, an HDR display needs, well, high dynamic range. This is easy for certain technologies, like OLED, which has a near-infinite contrast ratio and lots of brightness. It's also a cinch for many LCDs with local dimming, which gives them a better contrast ratio and therefore a wider dynamic range.
It's a lot tougher for projectors.
All projectors use one of three technologies: DLP, LCD, and LCoS. LCoS, available in projectors made by Sony and JVC (who brand it SXRD and D-ILA, respectively), has the best contrast ratio of any of the projector techs, but even it falls far short of what OLED and local-dimming LCDs can do. DLP and LCD, which together make up the bulk of the home projector market, don't come anywhere close to even LCoS's contrast ratio. Lacking the area-dimming contrast boost that local dimming gives LCD TVs, projectors just don't have the dynamic range available to make any real use of HDR content.
That's because the light source -- a lamp or, in some projectors, a laser -- shines light on the entirety of the tiny image chip(s). The only way to dim the light is to dim the entire image. That's true even on projectors with irises.
In other words, with a projector it's not possible to dim specific pixels (as it is with OLED) or even larger areas of the image (like LCD local dimming). So a section of the image can only be as bright or dark as the image chip can make it, and in all projectors, that's far less than what flat panel TVs can do.
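To put rough numbers on that, here's a minimal sketch of what per-pixel or per-zone dimming does to dynamic range. The luminance figures (a roughly 2,000:1 projector, a 1,000-nit local-dimming LCD) are illustrative assumptions, not measurements of any particular model:

```python
# Rough illustration of why per-pixel (or per-zone) dimming matters for dynamic range.
# The luminance figures below are illustrative assumptions, not measured values.

import math

def contrast_ratio(peak_nits, black_nits):
    """Simultaneous (on-screen) contrast: brightest area vs. darkest area."""
    return peak_nits / black_nits

def stops(ratio):
    """Dynamic range expressed in photographic stops (doublings of light)."""
    return math.log2(ratio)

# Hypothetical projector: the lamp lights the whole chip, so the black level is
# set by the chip's native contrast -- dimming the blacks also dims the whites.
projector = contrast_ratio(peak_nits=150, black_nits=0.075)   # ~2,000:1 native

# Hypothetical local-dimming LCD TV: dark zones can be dimmed independently.
lcd_tv = contrast_ratio(peak_nits=1000, black_nits=0.02)      # ~50,000:1

# An OLED switches pixels off entirely, so its on-screen contrast is effectively infinite.
print(f"Projector: {projector:,.0f}:1  (~{stops(projector):.1f} stops)")
print(f"LCD TV:    {lcd_tv:,.0f}:1  (~{stops(lcd_tv):.1f} stops)")
```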
Casio's laser/LED/phosphor hybrid light system in a DLP projector. A traditional lamp-based DLP would have a single "light bulb" and a color wheel to supply the colors. Regardless, there's no way to adjust the brightness on a per-pixel level beyond what the DLP chip can do, only the entire image at once.
Another big issue: brightness. HDR video tells a display to create specific, actual brightness levels. The sun's reflection off a chrome bumper might be 1,000 nits, for example. No home projector has anywhere close to 1,000 nits. The HDR content is effectively a drill sergeant screaming "Jump 10 feet in the air!" at a little fluffy bunny. That bunny may have hops, but it's not getting 10 feet in the air. Well, unless you throw it. Don't throw bunnies. Or projectors.
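For a sense of how far off that is, here's a back-of-the-envelope estimate of on-screen brightness. The 2,000-lumen projector and 100-inch, gain-1.0 screen are assumptions picked for illustration:

```python
# Estimate the peak luminance (nits) a projector can put on a screen.
# Formula: nits ~= lumens * screen_gain / (pi * screen_area_m2).
# The projector and screen numbers below are illustrative assumptions.

import math

def screen_nits(lumens, diagonal_inches, gain=1.0, aspect=16/9):
    """Approximate full-white luminance of a projected image, in cd/m^2 (nits)."""
    diag_m = diagonal_inches * 0.0254
    width = diag_m * aspect / math.sqrt(aspect**2 + 1)
    height = width / aspect
    area_m2 = width * height
    return lumens * gain / (math.pi * area_m2)

# A bright-ish 2,000-lumen home projector on a 100-inch, gain-1.0 screen:
print(f"{screen_nits(2000, 100):.0f} nits")   # roughly 230 nits -- nowhere near 1,000
```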
We'll get to what happens when you try to display HDR on a projector in the "Mapping" section below.
Color:
In addition to dynamic range, HDR also allows a wide color gamut, or WCG. This is a more solvable problem, but it's still an issue. When you increase how deep you can make colors, the amount of light you're able to produce drops. WCG means much deeper colors, which means a dimmer projector. In the projector business, light output is everything. In many cases, projectors are bought or not based on their lumen ratings (brightness) alone. That's a really bad idea in general, but that's a different topic.
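To make "deeper colors" concrete, here's a small sketch using the commonly quoted linear-light matrix for converting Rec. 2020 (the WCG container) to Rec. 709 (the older HDTV gamut). A fully saturated WCG red simply has no in-range Rec. 709 equivalent:

```python
# What "deeper colors" means in practice: the most saturated Rec. 2020 (WCG)
# primaries don't exist inside the older Rec. 709 gamut at all. The 3x3 matrix
# below is the commonly quoted BT.2020 -> BT.709 conversion in linear light.

BT2020_TO_BT709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def to_bt709(rgb2020):
    """Matrix-multiply a linear Rec. 2020 RGB triple into Rec. 709 coordinates."""
    return [sum(m * c for m, c in zip(row, rgb2020)) for row in BT2020_TO_BT709]

# A fully saturated Rec. 2020 red, expressed in Rec. 709 coordinates:
print(to_bt709([1.0, 0.0, 0.0]))   # ~[1.66, -0.12, -0.02]: out of range, so a
# display (or projector) limited to Rec. 709 physically cannot show that color.
```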
Eventually we'll see new light sources for projectors, like lasers and laser/phosphor hybrids, hit the mainstream market with acceptable brightness levels. Maybe even enough for WCG. But right now these techs are either dim, expensive or inaccurate. The projector market moves way slower than the TV market, so it might be a while before we see $1,000 projectors with laser light sources and full-gamut WCG. Even so, that's far more likely to happen than some new technology that gives projectors the contrast ratios of OLED and local-dimming LCDs, which isn't even on the horizon.
The Optoma UHD60 is compatible with HDR, but HDR actually looks worse on it than standard dynamic range.
Mapping:
No, I'm not talking about cartography. Matching the source material to the display's capabilities, a process called mapping, is more esoteric than color and dynamic range. But it could be the most crucial aspect of good-looking HDR on any display.
As mentioned above, HDR video works a bit differently than traditional video. To oversimplify a bit, HDR video tells a display to create a certain, actual level of brightness. Traditional video doesn't do this. Traditional video just tells your TV "create 80 percent of your max brightness." With HDR, it's "create 1,000 nits."
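Here's a minimal sketch of that difference, using the PQ transfer function from SMPTE ST 2084 (the basis of HDR10) next to a plain SDR gamma curve. The 100-nit SDR display and the gamma of 2.4 are assumptions for illustration:

```python
# "Create 80 percent of your max" vs. "create 1,000 nits": SDR video is relative,
# HDR10/PQ video is absolute. Sketch of both transfer functions (PQ per SMPTE ST 2084;
# the 100-nit SDR display peak and gamma 2.4 are assumptions for illustration).

def sdr_nits(signal, display_peak_nits=100.0, gamma=2.4):
    """SDR: the signal is a fraction of whatever the display happens to be able to do."""
    return display_peak_nits * (signal ** gamma)

def pq_nits(signal):
    """HDR10 (PQ): the signal encodes an absolute luminance, up to 10,000 nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(sdr_nits(0.9))   # ~78 nits on a 100-nit display (and ~780 on a 1,000-nit one)
print(pq_nits(0.75))   # ~1,000 nits requested, no matter what the display can do
```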
The problem is that most displays can't reproduce all the brightness required by the most-demanding HDR content. So they "map" the video to what they can produce. For example, the content may say "create 1,000 nits" but the TV can only do 500 nits. What does it do with the other 500 nits?
This is where it gets dicey. The TV or projector is basically guessing what works. So the image could be too dark, or more likely, the bright highlights, like details in clouds for example, will be blown out blobs.
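Here's a toy example of the difference between the naive approach and a more sensible one, for a hypothetical display that tops out at 500 nits. This is not any manufacturer's actual algorithm, just a sketch of the idea:

```python
# Toy tone mapping for a display that tops out at 500 nits. Not any manufacturer's
# real algorithm -- just the difference between clipping and a gentle roll-off.

def hard_clip(nits, peak=500.0):
    """Naive approach: everything above the display's peak becomes one flat value."""
    return min(nits, peak)

def roll_off(nits, peak=500.0, knee=350.0):
    """Simple knee/roll-off: compress the top end so highlight detail survives."""
    if nits <= knee:
        return nits
    excess = nits - knee                       # how far the content exceeds the knee
    return knee + (peak - knee) * excess / (excess + (peak - knee))

# Three distinct highlights in the content (think: three separate lights):
for content_nits in (600, 800, 1000):
    print(content_nits, "->", hard_clip(content_nits), "vs", round(roll_off(content_nits)))
# hard_clip: 500, 500, 500   -- one blown-out blob
# roll_off:  ~444, ~462, ~472 -- dimmer, but the three lights stay distinct
```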
Two projector images, side by side. Notice how there are three individual lights in the left image, but a single blob of light on the right. Neither projector can produce all the light required by the content, but the left projector is more accurately mapping what's in the content so you can see it.
Since no projector is capable of the brightness potential of even a midrange TV, this tone mapping becomes all the more crucial. If it's done right, you probably won't notice anything is happening. Done wrong, and you could lose highlights, color saturation and so on. Or to put it another way, the projector could look worse showing HDR content than regular content.
That's what CNET reviewer David Katzmaier saw when he reviewed the Optoma and BenQ projectors mentioned above.
Although BenQ claims HDR compatibility and the HT2550 can display an HDR image, it isn't the real thing. For Guardians of the Galaxy Vol. 2, for example, the image looked relatively washed-out and lifeless, without the punch I expected. Colors appeared less saturated, and as a whole there just wasn't as much impact to the image [as with SDR]. The Optoma's HDR looked even worse in most scenes, for what it's worth, with blown-out highlights and an even flatter look.
The good news is this is something that could largely be fixed with a firmware update. Not "fixed so it looks like HDR" fixed, but "fixed so it doesn't look worse than SDR" fixed. That doesn't mean a projector manufacturer will offer such an update, however, so it's something to keep in mind when you're reading reviews.
High-ish dynamic range:
This shouldn't turn you off getting a projector. A 100-inch image, even without HDR, is a magnificent thing. I've been using a projector as my main TV for over 15 years, and while I think HDR, especially OLED HDR, looks absolutely incredible, not once have I considered downgrading to even a 77-inch TV.
Will projectors ever get "true" HDR? Well… I won't hold my breath. None of the current technologies, DLP, LCoS or LCD, is capable of the pinpoint per-pixel brightness of OLED, or even the "area brightness" of local-dimming flat-panel LCDs. Without one of those two things, they just can't create the dynamic range that's required and implied by the very name HDR.
Wider color gamut is a challenge too, as the deeper you make the colors, the harder it is to create brightness. Since projectors aren't all that bright to begin with, most manufacturers aren't thrilled with the idea of making a dimmer projector just for deeper colors. That's not to say it can't or won't happen; it's certainly possible. It will just be harder to find in the lower price ranges, where lumen specs still seem to sell above all.
Laser and LED light sources hold some promise that light output and colors will improve, but they're still a long way from mainstream pricing (or acceptable brightness, take your pick). And even then, the contrast ratios will still be lacking, since it's the image chips themselves that largely determine the contrast ratio.
But like I said, a massive image goes a long way toward making you forget about any trendy abbreviations. This is just one aspect of performance, but it's important to keep HDR's mediocre performance on projectors in mind, since that compatibility will likely be touted in every piece of marketing you read.
At best, HDR on an HDR-compatible projector will look a little better, typically as a result of wider color -- the Epson HC4000 is a good example. At worst, it will look a lot worse. The specs won't tell you which, though. Good thing there are reviews.
But then, you're probably used to being a step behind in the tech and trends department. After all, it was only recently that 4K projectors became readily available at reasonable prices.