DaMacFunkin wrote: ↑Fri Mar 19, 2021 7:21 am
Sorry late to the party, received the Minix and flashed the Ugoos firmware late last night, are we saying choosing TV-led processing causes incorrect colours with HDR10 and SDR? Why is that even a thing? TV-led processing should only affect Dolby Vision, does HDR10 not pass through?

Yes, for some reason that's how it turns out. With DV off, you can choose the combination of output bit depth and chroma, and RESET_9999 has tested SDR/HDR10 content and found that the output is accurate at 12-bit 4:2:2 and slightly inaccurate at 8-bit 4:4:4 (other bit depth/chroma combinations, e.g. 10-bit 4:2:0, might also be accurate; we just don't have test results for those).
However, with TV-led DV on, SDR and HDR10 content is for some reason forcibly output at 8-bit YCbCr 4:4:4, and the bit depth and chroma can't be changed. HDR10 metadata still passes through; it's just that the output colour depth is 8-bit, and strangely even SDR content shows slightly incorrect colours. Forcing 8-bit YCbCr 4:4:4 for SDR/HDR10 doesn't really make any sense, because when DV content is played, the output switches to 8-bit RGB anyway.
With LLDV on, all content (SDR/HDR10/DV) is locked to output at 12-bit YCbCr422. This is correct for LLDV content and (coincidentally?) SDR/HDR10 content too.
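To keep the three behaviours straight, here's a minimal sketch encoding the output formats described above. The mode and content names, the table, and the helper function are purely illustrative (my own naming, not any real Ugoos API or setting identifiers):

```python
# Hypothetical summary of the observed HDMI output behaviour on the
# Ugoos firmware, based on the tests described above. All names here
# are illustrative, not real firmware APIs or setting values.

FORCED_OUTPUT = {
    # (DV setting, content type) -> forced output format, or None if
    # the bit depth/chroma can be chosen freely in the settings.
    ("dv_off", "SDR"):   None,                   # user-selectable, e.g. 12-bit 4:2:2
    ("dv_off", "HDR10"): None,
    ("tv_led", "SDR"):   "8-bit YCbCr 4:4:4",    # forced; slightly inaccurate colours
    ("tv_led", "HDR10"): "8-bit YCbCr 4:4:4",    # forced; metadata still passes through
    ("tv_led", "DV"):    "8-bit RGB",            # expected restriction for TV-led DV
    ("lldv",   "SDR"):   "12-bit YCbCr 4:2:2",   # locked, but accurate
    ("lldv",   "HDR10"): "12-bit YCbCr 4:2:2",   # locked, but accurate
    ("lldv",   "DV"):    "12-bit YCbCr 4:2:2",   # correct for LLDV
}

def output_format(dv_setting: str, content: str) -> str:
    """Return the HDMI output format for a given DV setting and content type."""
    forced = FORCED_OUTPUT[(dv_setting, content)]
    return forced if forced is not None else "user-selected"
```

The table makes the oddity easy to see: the only combination that produces slightly inaccurate colours is TV-led DV with non-DV content.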
Without further information it seems like a bug or oversight, because there doesn't appear to be any good reason to restrict the output bit depth/chroma for SDR/HDR10 content when TV-led DV is on (of course, DV content should still be restricted to 8-bit RGB). It should be easy for Ugoos to fix. In the meantime, we can toggle TV-led DV on and off depending on the content to get the most accurate colours.