The key is setting the HDR/AVI max luminance to 10,000 nits, not just the DV Data Block. This is the crucial setting that keeps highlights from blowing out, along with using the least aggressive HDR curve on your display. On the LK990 that means setting "HDR Brightness" to -2 (the options are -2, -1, 0, 1, 2, where 0 is the default for 1,000 nits).

Manixx2020beyound wrote:
Yes, I always applied the DV meta lum values.

HarperVision wrote: ↑Wed Jul 28, 2021 4:46 am
Automix with 10,000 nits?

Manixx2020beyound wrote:
Exactly what I use; did that as well in the past.
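For anyone wondering why 10,000 nits specifically: that is the full-scale value of the PQ curve (SMPTE ST 2084) that HDR10 and Dolby Vision signals are encoded with. A quick sketch of the inverse EOTF, my own illustration using the standard ST 2084 constants, shows that signaling 10,000 nits simply tells the display the content may use the entire PQ range:

```python
# Sketch (my illustration, not from this thread): the SMPTE ST 2084 (PQ)
# inverse EOTF. A PQ code value of 1.0 corresponds to exactly 10,000 nits,
# which is why 10,000 is the natural "full range" ceiling to signal.

# Standard PQ constants from SMPTE ST 2084
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map absolute luminance (cd/m^2) to a normalized PQ signal value."""
    y = nits / 10000.0          # PQ is defined over 0..10,000 nits
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2
```

For reference, 10,000 nits encodes to exactly 1.0 (full scale), while 1,000 nits already sits around 75% of the PQ signal range, so a display that applies a gentle curve on top still has headroom for the brightest highlights.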
BT.2020 at 10k comes out unwatchable; bright scenes are overblown.
What are you watching it on?
There is nothing to watch 10k on besides the monitor itself, which does look beautiful at 10k nits.
Because the capture file becomes 10k nits as well, but that does not match the 1,000-nit HDR version.
Also, an LED TV can't fully use 10k nits; I'm sure it will utilize the UHD TV's max lum values. Forcing 10k nits doesn't look right on the CX or any UHD TV out there.
Everyone thinks the DV Data Block is the only thing that affects the tone mapping, and that's simply NOT true; I've proven it time and time again! It is easily seen and repeatable when testing scenes like The Meg chapter 8 and Aquaman chapter 6.
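That lines up with how the metadata actually travels: the max-luminance value under discussion is carried in the HDMI Dynamic Range and Mastering (DRM) InfoFrame defined by CTA-861, which is separate from the Dolby Vision data block and which the display also reads when it picks a tone map. A rough sketch of how that one field is packed, assuming (per my reading of CTA-861) a 16-bit little-endian value in units of 1 cd/m²:

```python
import struct

def max_dml_bytes(nits: int) -> bytes:
    """Pack Max Display Mastering Luminance the way the CTA-861 DRM
    InfoFrame stores it: a 16-bit little-endian integer in units of
    1 cd/m^2 (assumed layout, for illustration only)."""
    return struct.pack("<H", nits)

# 10,000 nits -> bytes 0x10 0x27 (0x2710 == 10000, LSB first)
```

So a source signaling 10,000 here and a source signaling 1,000 send different bytes in the static HDR metadata regardless of what the DV Data Block says, which is consistent with the two settings producing visibly different tone mapping.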