HDR is Not Tied to Resolution
Whether a signal is SDR or HDR has nothing to do with display resolution, gamut, or frame rate; these characteristics are completely independent of one another. Most importantly:
- HDR is resolution agnostic. You can have a 1080p (HD) HDR image, a 3840 × 2160 (UHD) SDR image, or a UHD HDR image. Right this moment, a display being capable of HDR guarantees nothing else about it.
- HDR is gamut agnostic as well, although the HDR displays I’ve seen so far adhere either to P3 or to whatever portion of the far wider Rec.2020 gamut they can manage. Still, there’s no reason you couldn’t master a BT.709 signal with an HDR EOTF; it’d just be kind of sad.
- You can deliver HDR in any of the standardized frame rates you care to deliver.
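The independence described above can be sketched as a data structure in which each characteristic is a separate field, with no combination ruled out. This is a minimal illustration, not any real API; the class and field names are invented for the example.

```python
# Illustrative sketch: a video signal's resolution, gamut, transfer
# function (SDR vs. HDR EOTF), and frame rate are orthogonal axes.
# Any combination describes a legal signal; HDR constrains nothing else.
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class SignalFormat:          # hypothetical name, for illustration only
    resolution: tuple        # (width, height) in pixels
    gamut: str               # e.g. "BT.709", "P3", "Rec.2020"
    eotf: str                # e.g. "BT.1886" (SDR), "ST.2084" (HDR PQ)
    fps: float               # any standardized frame rate

# All 2 x 3 x 2 x 2 = 24 combinations are valid signal descriptions,
# including an HD, BT.709, HDR signal.
formats = [
    SignalFormat(res, gamut, eotf, fps)
    for res, gamut, eotf, fps in product(
        [(1920, 1080), (3840, 2160)],
        ["BT.709", "P3", "Rec.2020"],
        ["BT.1886", "ST.2084"],
        [23.976, 59.94],
    )
]
```

For instance, `SignalFormat((1920, 1080), "BT.709", "ST.2084", 23.976)` — an HD, Rec.709, HDR signal — is just as valid a description as a UHD, Rec.2020, HDR one.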
That said, the next generation of professional and consumer displays seems focused on the combination of UHD resolution (3840 × 2160) and HDR, with at least a P3 gamut. To encourage this, the HDR10 industry recommendation and the “Ultra HD Premium” industry brand name are being attached to consumer displays capable of such a combination of high-end features (more on this later). As a side note, HDR10 is not the same as Dolby Vision, although both standards use the same EOTF as defined by ST.2084 (more on this later).
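That shared ST.2084 transfer function, often called PQ (Perceptual Quantizer), maps absolute luminance from 0 to 10,000 cd/m² to a nonlinear signal value. A rough sketch of the encode/decode pair, using the constants published in the standard (function names here are my own):

```python
# Sketch of the SMPTE ST.2084 "PQ" transfer function and its inverse.
# Luminance is absolute, in cd/m^2 (nits), over the range 0-10000.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    """Inverse EOTF: absolute luminance -> nonlinear signal in [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

def pq_decode(signal: float) -> float:
    """EOTF: nonlinear signal in [0, 1] -> absolute luminance in nits."""
    e = signal ** (1.0 / M2)
    return (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1) * 10000.0
```

Because PQ is defined in absolute terms, 100 nits (traditional SDR reference white) encodes to a signal value of roughly 0.51, leaving the upper half of the code range for highlights above SDR white.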
Higher resolutions are not required to output HDR images. They’re just nice to have in addition.