There is a very interesting article, discussion, and poll on this topic at AVSForum, covering HDR on TVs and projectors in light of the current state of video display technology.
For anyone thinking about making the jump to UHD who is not quite sure where projectors and televisions stand, it can be a great help.
Poll: HDR UHD/4K TV vs. Front Projection for Home Cinema - AVSForum.com
http://www.avsforum.com/forum/92-co...-front-projection-uhd-4k-hdr-home-cinema.html
“4K/UHD and HDR are undeniably a killer combo when it comes to bringing cinematic image quality home. Today’s TVs deliver color and contrast that is unprecedented, and movies mastered for home viewing in high dynamic range possess peak highlights that no projector can reproduce. This begs the question: Are TVs now the ultimate way to watch a movie at home, or does projection still supply the best viewing experience?
Before taking the poll, let’s take a quick look at the current state of the art in home video. On the TV side, thanks to HDR, 1000-nit highlights are a reality, as is virtually complete coverage of the DCI/P3 color gamut used in commercial cinema. The result is striking imagery that “pops” off the screen. And as a bonus, with a TV you can still enjoy optimal image quality with some ambient light in the room.
However, the catch with UHD TVs is that to get the very best contrast you have to go with OLED, a technology that is limited in terms of available screen sizes and pricey if you go above 65 inches. Meanwhile, if you’re looking for a larger and more affordable LCD, you give up some performance in terms of rendering deep blacks, and once you pass 88 inches, prices skyrocket.
Projection has a different set of strengths and weaknesses compared with TVs. For one, the cost of entry to 4K and HDR is higher; you’re going to have to invest a few thousand bucks to get that capability. But you immediately have access to screen sizes TVs cannot reach, regardless of price.
The catch with projection is that peak highlights are but a fraction of what even a modest HDR TV can offer, so HDR projectors are much more dependent upon tone mapping than TVs are. But on the flip side, the larger screens of front-projection home-cinema systems allow viewers to see more of the detail found in 4K/UHD content while sitting at a comfortable distance.
The discrepancy in highlight rendition between high dynamic-range TVs and projectors is quite large, but proper home theater-style projection takes place in a completely light-controlled environment, so a good HDR 4K projector operating in a completely black room can offer the “pop” you see with HDR TVs. Sure, the projector has to rely on tone mapping to a greater extent than a TV does, but the rich DCI/P3 color and deep blacks that state-of-the-art HDR projectors provide help make up for that.
I don’t believe there is a right or wrong answer to this question, but it’s still worth asking: What’s the ultimate way to watch a 4K HDR movie at home? Projection or TV?”
Click this link to vote in the poll: HDR TV vs. Front Projection for UHD/4K HDR Home Cinema
Also included is a post by Tom Huffman, developer of ChromaPure, which is quite illustrative of the current situation.
Let me tell you a story.
“When HDR was first introduced and I got the formula working, I realized right away that I really didn't know how to implement it. The problem is that--unlike other gamma standards that define output at every video level by reference to 100% white, which is unspecified--HDR10 specified 100% white as 10,000 nits. No display can even come close to this. The OLEDs don't even reach 1/10 of this. Projectors are doing well if they reach 1% of this! As I had done before, I reached out to a few industry insiders asking how this seemingly impossible standard was supposed to be implemented in the real world. I didn't receive any answer to my queries. This was strange. I had asked questions like this in the past and always got some input, but this time nothing. The only advice I received was to calibrate as best you could and just clip signals above the display's capabilities, which in most cases is about 70% video. This is not an ideal solution, but it worked.
As time went on I began to hear a lot about tone mapping. In short, this is an attempt to bend the PQ curve to minimize clipping. The thing is, there is no standard for tone mapping. Its implementation is left up to every vendor and manufacturer. There is another phrase for tone mapping. It is called "Making sh*t up." It then occurred to me that the reason no one had answered my earlier query is that NO ONE KNEW.
After speaking to some industry folks recently, I decided to experiment with tone mapping. Based on one suggestion, I tried a flat 2.4 power law gamma. This looked shockingly good. It is counterintuitive at first glance. I mean what is the point of high dynamic range if you are not using high dynamic range? However, if you look at what HDR is doing, almost all of the increase in dynamic range is at the high end. There is some increase at the low end, but it is fairly small. By using a power law gamma you now cover the entire luminance range of the display without clipping. Since the display is in HDR mode you can take advantage of whatever high-end luminance that the display offers, so you increase the dynamic range by 500-600%. The only downside I see is that the low end is considerably elevated from what the PQ curve suggests. For example, the PQ curve specifies 0.32 nits output @ 10% video. A 2.4 power law gamma specifies 2.39 nits @ 10% video assuming 100% video is 600 nits. I am also going to try a hybrid gamma of PQ up to 50% video and then a 2.4 power law above that. Since there is no standard, I am free to try literally anything. I'll use whatever looks the best by eyeballing, and I'll offer several options to users to pick whatever they think looks best.”
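To put some numbers on the first paragraph of Huffman's post, here is a minimal Python sketch of the SMPTE ST 2084 (PQ) EOTF that HDR10 is built on. The curve constants are the published ST 2084 values; the 600-nit display mentioned in the comments is only an assumed example, chosen to show why "about 70% video" is roughly where clipping has to start on typical hardware.

```python
# A minimal sketch of the SMPTE ST 2084 (PQ) EOTF that HDR10 uses. It maps a
# normalized signal value (0.0-1.0) to absolute luminance in nits. The five
# constants are the published ST 2084 values; the 600-nit display referenced
# in the comments is only an assumption for illustration.

M1 = 2610 / 16384       # 0.1593017578125
M2 = 2523 / 4096 * 128  # 78.84375
C1 = 3424 / 4096        # 0.8359375
C2 = 2413 / 4096 * 32   # 18.8515625
C3 = 2392 / 4096 * 32   # 18.6875


def pq_eotf(signal: float) -> float:
    """Decode a normalized PQ signal (0-1) to luminance in nits (0-10,000)."""
    e = signal ** (1 / M2)
    return 10_000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)


if __name__ == "__main__":
    # 100% video is defined as 10,000 nits, far beyond any current display.
    # Around 70% video the curve already asks for roughly 600 nits, which is
    # about where a bright HDR TV (let alone a projector) runs out of light,
    # so a display that simply clips does so from that signal level upward.
    for sig in (1.00, 0.70, 0.50, 0.10):
        print(f"{sig:4.0%} video -> {pq_eotf(sig):8.2f} nits")
```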
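And as a rough illustration of the tone-mapping options he describes, the sketch below compares a hard-clipped PQ curve, a flat 2.4 power-law gamma, and a hybrid that follows PQ up to 50% video, all for an assumed 600-nit peak. The hybrid splice is only one plausible reading of the idea in the post, not ChromaPure's actual implementation.

```python
# A rough comparison of three ways a 600-nit display could render the HDR10
# signal: the PQ curve hard-clipped at the display peak, the flat 2.4
# power-law gamma described above, and a hybrid that tracks PQ up to 50%
# video with a power law above it. PEAK_NITS = 600 matches the example in
# the post; the exact hybrid splice here is just one possible reading of the
# idea, not ChromaPure's actual implementation.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
PEAK_NITS = 600.0  # assumed display peak


def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 PQ EOTF: normalized signal (0-1) -> luminance in nits."""
    e = signal ** (1 / M2)
    return 10_000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)


def pq_clipped(signal: float) -> float:
    """Follow PQ exactly and hard-clip at the display peak (the early advice)."""
    return min(pq_eotf(signal), PEAK_NITS)


def power_law(signal: float, gamma: float = 2.4) -> float:
    """Flat power-law gamma scaled so that 100% video lands on the display peak."""
    return PEAK_NITS * signal ** gamma


def hybrid(signal: float) -> float:
    """Track PQ up to 50% video, then ease into the display peak above that."""
    if signal <= 0.5:
        return pq_eotf(signal)
    knee = pq_eotf(0.5)          # luminance where the two segments meet
    t = (signal - 0.5) / 0.5     # re-normalize the upper half of the range
    return knee + (PEAK_NITS - knee) * t ** 2.4


if __name__ == "__main__":
    print(" sig   PQ-clip  2.4 gamma   hybrid   (nits)")
    for sig in (0.10, 0.50, 0.70, 1.00):
        print(f"{sig:4.0%} {pq_clipped(sig):9.2f} {power_law(sig):10.2f} {hybrid(sig):8.2f}")
    # At 10% video the PQ curve asks for about 0.32 nits, while the flat 2.4
    # power law gives 600 * 0.1**2.4, roughly 2.39 nits: the elevated-shadow
    # trade-off mentioned in the post.
```

Running it reproduces the figures in the post: about 0.32 nits at 10% video for PQ versus about 2.39 nits for the flat 2.4 gamma, which is exactly the raised-shadow trade-off Huffman points out.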