RESET_9999 wrote:
Fake DV vs Real HDR10: https://screenshotcomparison.com/comparison/12710
Real DV (p5 web) vs Real HDR10: https://screenshotcomparison.com/comparison/12712
As you can see, it just brightens the image. I wouldn't recommend using the x700 HDR10/SDR to DV on the fly conversion.

It only appears to brighten the image because the conversion is now properly tone mapping the output to the display’s reported capabilities, such as peak luminance. Before, 1,000-nit content was being sent on a static PQ curve to a display with less luminance than that available, so the luminance levels landed in the wrong places and the scene came out darker than it should be.
In other words, what’s supposed to sit at, say, 10-bit code value 150 now lands at its proper level, whereas before the entire 1,000-nit range was being artificially crushed and lowered into, say, a 200-nit static container.
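To put rough numbers on that, here is a quick Python sketch (my own illustration, not Sony's actual conversion). pq_eotf() is the standard SMPTE ST 2084 (PQ) EOTF; the two mapping functions and the 200-nit display peak are assumptions I picked just to show where luminance levels end up.

def pq_eotf(code, bit_depth=10):
    """Decode a PQ code value to absolute luminance in nits (SMPTE ST 2084 EOTF)."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    e = code / (2 ** bit_depth - 1)      # full-range normalization, for simplicity
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def static_scale(nits, content_peak, display_peak):
    """Naive static handling: squeeze the whole 0..content_peak range into the
    display's peak, which drags mid-tones far below their intended level."""
    return nits * display_peak / content_peak

def rolloff(nits, content_peak, display_peak, knee=0.7):
    """Tone map toward the display's reported peak: shadows and mid-tones stay
    1:1 up to a knee, only the highlights get compressed. (Real curves such as
    BT.2390 are more sophisticated; this is just the shape of the idea.)"""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    t = (nits - knee_nits) / (content_peak - knee_nits)
    return knee_nits + t * (display_peak - knee_nits)

# 10-bit code 520 is roughly 100 nits, 712 roughly 600 nits, 769 roughly 1,000 nits.
for code in (520, 712, 769):
    nits = pq_eotf(code)
    print(code, round(nits), round(static_scale(nits, 1000, 200)), round(rolloff(nits, 1000, 200)))

# Output (code, intended nits, static scale, roll-off to reported peak):
# 520 100 20 100    <- mid-tone crushed by the static mapping, kept at its intended level by the roll-off
# 712 598 120 172
# 769 999 200 200   <- the panel tops out either way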
azreil24 wrote:
All Sony players convert everything to DV if you have the DV option activated. It sounds cool, but it isn't as it sends SDR content in DV container and messes up the colors and everything. At least that's how it works with discs and the reason I sold it, as I was tired of having to manually activate DV based on movie.

This is incorrect if the display maps the colors and dynamic range as it should. If it doesn’t, that’s on the display you’re using, not the player.
Colors within any of the color gamuts will be completely identical whether they are Rec709, DCI-P3, or BT2020, up to each gamut’s limits of course. Putting a Rec709 source into a wider P3 or BT2020 container should look and display identically when the gamuts are mapped properly, since Rec709 sits entirely inside either of those wider gamuts. It isn’t supposed to stretch the colors at all, and if it does, then your display is doing it wrong, not the source.
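As a sanity check on the gamut point, here is a second small sketch (again mine, just for illustration): re-expressing a Rec709 color in a BT2020 container with the standard primaries-conversion matrix changes the code values but not the color, and inverting the matrix recovers the originals exactly. The 3x3 coefficients are the published BT.709-to-BT.2020 values from ITU-R BT.2087; everything else is hypothetical.

import numpy as np

# Linear-light RGB conversion from Rec709 primaries to BT2020 primaries
# (coefficients per ITU-R BT.2087).
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

rgb_709 = np.array([1.0, 0.0, 0.0])   # a fully saturated Rec709 red, linear light

# The same physical color expressed in the wider BT2020 container. It is no
# longer [1, 0, 0] because BT2020's red primary is more saturated than Rec709's;
# the color itself has not been stretched, only given new coordinates.
rgb_2020 = M_709_TO_2020 @ rgb_709
print(rgb_2020)                        # ~[0.6274 0.0691 0.0164]

# Mapping back with the inverse matrix recovers the original values, which is
# what "identical when the gamuts are mapped properly" means. A display that
# instead treats Rec709 values as if they sat on BT2020 primaries is the one
# doing the stretching.
print(np.allclose(np.linalg.inv(M_709_TO_2020) @ rgb_2020, rgb_709))   # True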
I have both an X700 and an X800M2, as well as an Apple TV 4K, and with my displays none of them show the issues reported here, because they map the signal correctly.