"It turns out that Ultra HD Blu-ray support for Dolby Vision is on the way and awaits completion of a special system on a chip (SoC) to properly handle the single.(sic)"
What I wrote was, as usual, 100% correct. I said that Dolby Vision transmission does not a) require a new HDMI interface, nor b) do sources need new encoding hardware. Ergo, it can be done in software.
One could just as easily decode Dolby Vision using a general-purpose chip, entirely in software. You could even send Dolby Vision HDR signals from your old Xbox One, PS4, or a PC equipped with HDMI 1.4 if you wanted to (provided the encryption layer was approved, but that's a legal issue, not a technical one).
SoCs, systems on a chip, are there because they are cheaper to mass-produce and more power-efficient, but they are not required. For UHD Blu-ray players, yes, it would be too expensive to put a general-purpose CPU/GPU in there just for this. But I'm 100% sure such software implementations exist (because I've seen them).
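To show how little magic is involved, here is a minimal Python sketch of the kind of per-pixel arithmetic an HDR path runs on a general-purpose CPU. To be clear, this is not Dolby's proprietary reshaping/decoder (that part is licensed); it only applies the public SMPTE ST 2084 (PQ) EOTF to a 10-bit code value, purely as an illustration that the heavy lifting is plain math:

```python
# Illustrative only: the public SMPTE ST 2084 (PQ) EOTF, the kind of
# per-pixel arithmetic a software HDR pipeline performs. This is NOT
# Dolby Vision's proprietary enhancement/reshaping layer.

M1 = 2610 / 16384        # PQ constants from SMPTE ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code: int, bit_depth: int = 10) -> float:
    """Map a PQ-encoded code value to absolute luminance in nits (cd/m^2)."""
    e = code / ((1 << bit_depth) - 1)      # normalize code value to 0..1
    ep = e ** (1 / M2)
    y = max(ep - C1, 0.0) / (C2 - C3 * ep)
    return 10000.0 * y ** (1 / M1)

print(pq_eotf(1023))   # peak code value -> 10000 nits
print(pq_eotf(512))    # mid code value  -> about 93 nits
```

The proprietary part of Dolby Vision sits on top of this as dynamic metadata and a reshaping step, and that is still just arithmetic; nothing in it demands a dedicated chip, a dedicated chip is simply cheaper at scale.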
So yes, what I said is true. Reporting back to the source that "this TV can decode Dolby Vision" is part of the TV's handshaking hardware, but it works perfectly fine over HDMI 1.4. And what do I know, I only spoke with Dolby engineers directly about implementing it.
The TVs, of course, have to support Dolby Vision and report back to the sources that they implement it, but you can update the firmware on the source to handle new formats. This is exactly what the Xbox One and PS4 (original flavors) will do. BTW, firmware == software, for the purposes of semantics.
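And the "report it back to the source" part is nothing exotic either: it is the sink's EDID, which the source reads over the DDC lines during the handshake, a mechanism that predates HDMI 2.0 entirely. Here is a rough Python sketch of scanning a CTA-861 EDID extension block for a Vendor-Specific Video Data Block; the Dolby OUI constant and the exact payload layout are my assumptions for illustration, not quoted from any spec:

```python
# Sketch: walk the data block collection of a 128-byte CTA-861 EDID
# extension block and report the Vendor-Specific Video Data Blocks,
# the mechanism a sink uses to advertise extra video capabilities.
# DOLBY_OUI below is an assumed value, used here for illustration only.

DOLBY_OUI = 0x00D046  # assumed IEEE OUI for Dolby Laboratories

def vendor_video_blocks(ext_block: bytes):
    """Yield (oui, payload) for each Vendor-Specific Video Data Block."""
    if len(ext_block) != 128 or ext_block[0] != 0x02:
        raise ValueError("not a CTA-861 extension block")
    dtd_offset = ext_block[2]         # where detailed timing descriptors start
    i = 4                             # data block collection starts at byte 4
    while i < dtd_offset:
        tag = ext_block[i] >> 5       # upper 3 bits: data block tag
        length = ext_block[i] & 0x1F  # lower 5 bits: payload length in bytes
        payload = ext_block[i + 1 : i + 1 + length]
        # tag 7 = "use extended tag"; extended tag 0x01 = Vendor-Specific
        # Video Data Block, followed by a 3-byte IEEE OUI (LSB first)
        if tag == 7 and length >= 4 and payload[0] == 0x01:
            oui = payload[1] | (payload[2] << 8) | (payload[3] << 16)
            yield oui, payload[4:]
        i += 1 + length

def sink_advertises_dolby_vision(ext_block: bytes) -> bool:
    return any(oui == DOLBY_OUI for oui, _ in vendor_video_blocks(ext_block))
```

The point is that this capability exchange rides on the DDC/I2C channel that every HDMI (and DVI) port has had from day one, and a firmware update on the source can teach it to look for and act on a new data block. None of it depends on the HDMI version printed on the box.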
Please read this before you make any further comments on which versions of HDMI are necessary for Dolby Vision. Hint: if it works on HDMI 1.4, then it works on current/legacy HDMI 1.4 hardware. This is simple enough for anyone to understand; I just don't get why anyone would still argue over it.