mulucy wrote: ↑Tue Aug 03, 2021 7:14 pm
Sorry for a silly question, but I see "RGB 8bit DV " and 12-bit something referred a lot here with TV-Led and LLDV, would you mind telling the difference, please?
LLDV / player-led: the player does the tone mapping, which is buggy. It sends a decoded 4:2:2 12-bit signal (decoded, so that's why we can capture it).
RGB 8-bit tunneling / TV-led: the TV does the tone mapping, which is accurate. It's still sending 4:2:2 12-bit, but wrapped in an 8-bit RGB container. (@manix thinks he found how to capture it, but I don't believe him)
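For intuition on why a 12-bit 4:2:2 signal even fits inside an 8-bit RGB link: both average 24 bits per pixel. The actual Dolby scrambling/tunneling format is proprietary; this is only the back-of-the-envelope bandwidth arithmetic:

```python
# Bandwidth check: 12-bit YCbCr 4:2:2 vs 8-bit RGB 4:4:4.
# In 4:2:2, every pixel carries a 12-bit Y sample, while Cb and Cr
# are subsampled horizontally (one Cb/Cr pair shared per 2 pixels).
bits_422 = 12 + (12 + 12) / 2   # Y + shared chroma per pixel = 24.0
bits_rgb8 = 3 * 8               # R + G + B per pixel = 24

assert bits_422 == bits_rgb8 == 24
print(f"12-bit 4:2:2: {bits_422:.0f} bits/pixel, 8-bit RGB: {bits_rgb8} bits/pixel")
```

So the 8-bit RGB "container" has exactly enough room to carry the 12-bit 4:2:2 payload without any extra HDMI bandwidth.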
Don't worry about that: there is no quality being scaled down, the x700 doesn't have the power to do that. Many users here, myself included, use this player for DV and we've never had playback issues with any movie (except for the known green-screen bug with TS files).
HarperVision wrote: ↑Wed Aug 04, 2021 7:01 pm
I believe when you are playing ripped files with the Sonys and not discs, it actually does auto switch DV and HDR10. At least that’s been my experience with my X800M2.
With discs you have to manually switch though, as you’re saying.
You're almost right. The x700 will auto-switch between DV and HDR10 if your HDR10 container is MKV, but the problem is that the x700 only supports Atmos in a TS/M2TS/Blu-ray container (because mkvtoolnix splits off the AC3 core).
Any TS/M2TS/Blu-ray file (HDR10/SDR) will then get converted to DV. Netflix will also output everything in DV unless you turn off DV in the settings.
It's really not an issue for me, as I only use it for DV; I use my HTPC for HDR10/SDR.
The x700 goes even further and converts DV (even FEL) to HDR10 if you connect an HDR10-only display and leave DV turned on, not only for file playback but on Netflix too. But I don't recommend doing that, because it raises the black floor and you get incorrect metadata from Netflix.
HarperVision wrote: ↑Wed Aug 04, 2021 7:01 pm
I’m wondering if maybe since LLDV was built for and based on Sony HDR displays that didn’t have enough horsepower to decode full RGB 8 bit TV Led DV, that LLDV is supposed to be at Full range, and not Limited Range like normal HDR10 YCbCr 4:2:2 10 bit? Normally it is changed from Full to Limited in the DV decoding and processing before it’s presented to the display panel.
I doubt they ever intended anyone like me to discover the trick with an HDFury to get sources like an AppleTV and various UHD Bluray players to exploit using LLDV for displays that otherwise wouldn’t be DV capable. So maybe with a Sony LLDV UHD TV it knows it’s getting a Full range signal and processes it accordingly, but when that same LLDV signal is being tricked by an HDFury and sent to a normal HDR10 display, it thinks it’s getting a Limited Range HDR10 signal but in fact is actually getting Full range and the TV processes it thinking it’s Limited Range, causing a mismatch?
Or maybe I’m not understanding what you wrote and its intentions?
I think only profile 5 from streaming is supposed to be full range, and I think you're correct: it's converted from full to limited when the player does the tone mapping.
I tried to force full range on the AppleTV files, which are somehow signaling limited range, and it didn't make any difference...
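To make the range mismatch concrete, here's the standard 10-bit video-level mapping (generic narrow-range level math as used across HDMI/broadcast, not the DV pipeline itself; the function names are mine). Interpreting a full-range signal as limited raises black from code 0 to code 64, which is exactly a raised black floor:

```python
# Limited ("video") range puts 10-bit black at code 64 and white at 940,
# versus 0-1023 for full range. Mismatched interpretation shifts the
# black floor and clips or stretches the ends of the range.
def full_to_limited_10bit(code: int) -> int:
    """Map a full-range 10-bit code (0-1023) to limited range (64-940)."""
    return round(64 + code * (940 - 64) / 1023)

def limited_to_full_10bit(code: int) -> int:
    """Inverse mapping; codes outside 64-940 are clamped first."""
    code = min(max(code, 64), 940)
    return round((code - 64) * 1023 / (940 - 64))

print(full_to_limited_10bit(0), full_to_limited_10bit(1023))   # 64 940
print(limited_to_full_10bit(64), limited_to_full_10bit(940))   # 0 1023
```

If a display assumes limited but receives full, full-range black (code 0) sits below the display's expected black (code 64), so it either clips shadow detail or, if levels are stretched the wrong way, lifts the blacks.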
By the way, the trick you found for LLDV-in-HDR10 with the Vertex: the x700 has been doing exactly that since day 1 when you connect an HDR10 display, so it's part of the DV SDK capabilities.
The Vertex does it better, though; it doesn't raise the black floor like the x700 does.