Let me take this opportunity for my UHD/HDR rant.
I find discussions of calibrating UHD/HDR content extremely ironic because there is no other material in commercial video that I can think of that is less subject to calibration of any type than UHD/HDR content. Why do I say this?
Lack of content
To date there are only 26 UHD discs. Superficially, this seems like a good start, arguably better than Blu-ray, which had fewer titles available at this point in its release history. However, the number of titles is deceptive. Of these 26 titles, only 10 can be considered real UHD releases. Most titles were either filmed in 2K or used a 2K digital intermediate (DI) for mastering. The Amazing Spider-Man 2 is one of the better ones--it was shot on 35mm film and finished with a 4K DI.
Lack of standards
There is no real standard for calibrating HDR, at least no standard that can be applied to available commercial displays. The only clear approach is to calibrate up to the point the display clips (usually in the 70%-80% range) and then ignore everything above that. This problem is related to the next issue.
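To put a number on that clip point: HDR10 encodes luminance with the SMPTE ST 2084 (PQ) curve, and a quick sketch of the PQ EOTF (using the published ST 2084 constants) shows why roughly 75% signal is where a display peaking near 1,000 cd/m2 runs out of headroom:

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF: normalized signal -> absolute luminance.
# Constants are the published ST 2084 values.
m1 = 2610 / 16384          # 0.1593...
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32     # 18.8515625
c3 = 2392 / 4096 * 32     # 18.6875

def pq_eotf(signal):
    """Map a normalized PQ code value (0..1) to luminance in cd/m2 (nits)."""
    v = signal ** (1 / m2)
    return 10000 * (max(v - c1, 0) / (c2 - c3 * v)) ** (1 / m1)

for pct in (50, 75, 90, 100):
    print(f"{pct}% PQ signal -> {pq_eotf(pct / 100):,.0f} cd/m2")
```

A 75% PQ signal decodes to roughly 1,000 cd/m2, so on a typical 2015/2016 panel everything above about 75% maps to luminance the display cannot produce--which is why calibrating past the clip point is moot.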
Lack of hardware support
Although there is a growing list of software titles, the hardware support for UHD/HDR is close to non-existent, at least as far as calibration is concerned. There are virtually no test pattern sources for HDR calibration, and even if there were (forgetting for a moment the lack of standards that you would calibrate to), most of the 2015 crop of displays lock the user out of the main calibration adjustments (levels, white balance, etc.). The 2016 displays may be better in this regard, but a lot of people who bought displays in 2015 thinking that they could calibrate them for UHD/HDR content are going to be really disappointed.
About the only way that most of the current "UHD" displays can be calibrated for UHD/HDR content is to use an external processor, such as a Lumagen PRO--an effective, albeit expensive, alternative. However, even this option does not yet support test patterns for HDR.
Lack of 3D
I have never been much of a fan of 3D--the industry's last attempt to invent a new shiny thing to spur sales--but a lot of people disagree with me. If you like 3D and want to watch it on the UHD/HDR format, then you are out of luck. It isn't supported.
Color Confusion
When Blu-ray was released, it was clear that content should be calibrated to the existing HD standard, Rec. 709. However, UHD has been something of a mess in this regard. Initially, it was thought that UHD content would be mastered in the DCI-P3 gamut (in a Rec. 2020 container, whatever that means), but now it is clear that Rec. 2020 is the standard being used, despite the fact that no commercial displays support this gamut. It is not just that displays cannot produce a gamut that is wide enough (they can't); even on displays that offer so-called wide-gamut support, the native gamut is generally just an oversaturated Rec. 709 rather than an undersaturated Rec. 2020. You can see this easily by looking at the green hue line: just plot a line running from white to the green primary. The displays I have looked at run right across the Rec. 709 green primary, missing the Rec. 2020 hue line by 20 or 30 degrees.
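That hue-line check can be sketched numerically. Assuming the standard CIE 1931 xy chromaticities for D65 white and the Rec. 709 and Rec. 2020 green primaries (the exact angular gap depends on which chromaticity space you measure in; this is only an illustration):

```python
import math

# CIE 1931 xy chromaticities (standard published values).
D65 = (0.3127, 0.3290)        # white point
GREEN_709 = (0.300, 0.600)    # Rec. 709 green primary
GREEN_2020 = (0.170, 0.797)   # Rec. 2020 green primary

def hue_angle(white, primary):
    """Angle in degrees of the hue line from the white point to a primary."""
    dx, dy = primary[0] - white[0], primary[1] - white[1]
    return math.degrees(math.atan2(dy, dx))

a709 = hue_angle(D65, GREEN_709)
a2020 = hue_angle(D65, GREEN_2020)
print(f"Rec. 709 green hue line:  {a709:.1f} deg")
print(f"Rec. 2020 green hue line: {a2020:.1f} deg")
print(f"Gap between the two:      {a2020 - a709:.1f} deg")
```

A display whose native green simply oversaturates along the Rec. 709 hue line will track the first angle, not the second, so it misses the Rec. 2020 green hue line no matter how saturated its green is.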
Front projectors left out
If flat-panel displays fall short of the HDR standards, front projectors are ignored entirely. Flat panels may someday reach 10,000 cd/m2 peak output. Front projectors will NEVER achieve this, and shouldn't. There has always been a parallel set of standards for front projectors and direct-view displays. Until now, that is.
In short, most of the software released for this new standard does not meet the standard's own definition of true 4K content. The content that does cannot be calibrated, both because the standard is not fully defined and because, even if it were, the displays that support the format disable calibration controls and lack the gamut and peak output needed to display the content properly.
I understand that work is continuing along these lines and that these problems may eventually be resolved. Certainly it would seem relatively easy for manufacturers to stop locking users out of calibration controls when displaying UHD/HDR content. However, it has to be admitted that this new format release has been something of a disaster, certainly compared to what we have seen in previous format upgrades--VHS-to-DVD and DVD-to-Blu-ray come to mind (although the Blu-ray/HD-DVD format war didn't help there). UHD/HDR was rushed to market without much thought as to how the display manufacturers would support it, how post-production houses would handle it, or how end users might adjust it for optimal performance.

Don't worry about anything above 75%; your display will not be able to reproduce it. Don't worry about an actual gamma value, because without a 100% reference, it can't be calculated.
To answer your question directly, Ryan Masciola at rmadvancedcaldisc.com has test patterns on a USB stick that should work. However, your display will not allow you to adjust calibration parameters with a UHD/HDR signal.
Your display will not support Rec. 2020, and even if it did, it would not allow you to adjust color (without the use of an external processor).