deadchip12 wrote: ↑Fri Jan 05, 2024 8:37 am
ragico wrote: ↑Thu Jan 04, 2024 11:50 pm
This Dolby Vision "affair" is becoming more and more complicated and difficult for us common people.
Yeah, feels like we need a completely new thread with more organized instructions. Do we even consistently get a better image with Dolby Vision compared to HDR10 at this point? I honestly have no idea.
If you're using an LG or Panasonic TV with a color-accurate player running in TV-led mode, the answer should always be yes, as long as the Dolby Vision metadata actually matches the content being played.
If you're using a Sony A95K, you're better off with Dolby Vision. I'd guess the A95L will join it once Sony fixes its Dolby Vision EOTF tracking (accuracy there currently requires Gradation Preferred, which sounds like it substitutes Sony's own tone mapping for Dolby Vision's; see the sketch at the bottom of this post for what EOTF tracking means in practice).
If you're using pretty much any other TV-led-capable Sony TV, most have issues with their Dolby Vision base configuration data. Unless you can measure your specific panel, or find measurements from someone else who owns the same one, you may be better off using HDR10 with Sony's Gradation Preferred tone mapping.
If you're using a Sony TV that is only Player-led capable, it also depends on your playback device (per RESET's research, many playback devices aren't up to the task). These sets likely have less luminance and color volume, so they stand to benefit significantly from Dolby Vision's tone mapping. Still, I'd measure, since Sony has a history of striking out with Dolby Vision.
If you're using any other TV brand, there isn't enough data to say what's best. I'm unaware of anyone doing a deep analysis of Hisense, Vizio, TCL, or Philips TVs, since most enthusiasts are on LG, Sony, and Panasonic.
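For anyone wondering what "EOTF tracking" means when people measure it: in HDR10, the PQ curve from SMPTE ST 2084 defines an absolute luminance target for every stimulus level, so you put up gray-ramp patches, read the panel with a meter, and compare against the reference curve. Here's a minimal Python sketch of that comparison. The ST 2084 constants are straight from the spec, but the "measured" numbers are made-up placeholders, not readings from any of the TVs above; and since Dolby Vision's actual tone-mapping math is proprietary, this only illustrates the PQ reference side of the check.

[code]
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32     # 18.8515625
C3 = 2392 / 4096 * 32     # 18.6875

def pq_eotf(stimulus: float) -> float:
    """ST 2084 EOTF: normalized PQ signal (0..1) -> absolute luminance in nits."""
    e = stimulus ** (1.0 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

# (stimulus %, meter reading in nits) -- the readings here are placeholders;
# replace them with what your meter actually reports off a gray-ramp pattern.
readings = [
    (10, 0.34),   # PQ target ~0.32 nits
    (25, 5.4),    # PQ target ~5.2 nits
    (50, 95.0),   # PQ target ~92.2 nits
    (75, 940.0),  # PQ target ~983 nits (a ~1000-nit panel rolls off near here)
]

for pct, measured in readings:
    target = pq_eotf(pct / 100.0)
    error = 100.0 * (measured - target) / target
    print(f"{pct:3d}% stimulus: target {target:8.2f} nits, "
          f"measured {measured:8.2f} nits, error {error:+5.1f}%")
[/code]

If real readings track the targets within a few percent up to the point where the panel runs out of luminance and has to roll off, the set is tracking the curve; large, consistent deviations below that point are what people mean when they say a set's EOTF tracking is off.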