Bikeriders in the Green area. Don't understand why. It seems worse than the original grade on the UHD.
What? Picture quality (grain, detail) doesn't matter when you do a hybrid, since you only take the DV metadata.
What matters is the brightness and it is the same: https://slow.pics/c/3xklvXxC
But DV for a 150-nit HDR grade is totally useless.
reset_9999 i saw your pacific rim video today. is that really the best HDR10 movie?
I don't know if it's the best, but it certainly has crazy bright highlights shot after shot. The average/average peak is over 1000 nits and the real MaxCLL is over 7000 nits.
It actually looks way too bright / clipped on my low-brightness C2 (800 nits), but in DV it looks perfect.
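For anyone unfamiliar with the MaxCLL figure mentioned above: MaxCLL is the brightest single pixel anywhere in the title, while MaxFALL is the highest frame-average light level. A toy sketch (made-up pixel values, not measurements from Pacific Rim; real tools scan the decoded video):

```python
# Toy sketch of how MaxCLL / MaxFALL are derived from per-pixel
# luminance in nits. The frames below are invented numbers purely
# for illustration.

def maxcll_maxfall(frames):
    """frames: list of 2D lists of per-pixel light levels in nits."""
    max_cll = 0.0   # brightest single pixel across all frames
    max_fall = 0.0  # highest frame-average light level
    for frame in frames:
        pixels = [nits for row in frame for nits in row]
        max_cll = max(max_cll, max(pixels))
        max_fall = max(max_fall, sum(pixels) / len(pixels))
    return max_cll, max_fall

# Hypothetical clip: one dim frame, one frame with a 7000-nit highlight.
frames = [
    [[100.0, 120.0], [90.0, 110.0]],
    [[200.0, 7000.0], [150.0, 250.0]],
]
cll, fall = maxcll_maxfall(frames)
print(cll, fall)  # 7000.0 1900.0
```

Note how one small specular highlight drives MaxCLL to 7000 nits even though the frame averages stay far lower, which is exactly the "crazy bright highlights" situation.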
as i just said, no point watching static HDR10 when you can either generate or use metadata from web
Just thinking here: what about in the future, when HDR10/Dolby Vision is last-gen technology? Do you think they'll keep Dolby Vision on new-gen TV sets for backwards compatibility? HDR10 only? HDR10+? Maybe we should start acquiring iTunes HDR10+ metadata too?
No idea what the next video format will be, but I don't see it happening anytime soon. UHD will probably be the last physical media, and with all the energy-consumption restrictions and the climate crisis, it will take a while before TVs can do 10,000 nits at a 100% window (which would make dynamic metadata useless). Even the brightest OLED can barely do 200 nits at 100%, though I think I've seen a YouTube video (HDTVTest) about a Chinese TV that can do 10,000 nits.
So unless there's a breakthrough in TV energy consumption and all TVs can suddenly do 10,000 nits, HDR/DV will be the best format for a very long time.
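The 10,000-nit figure isn't arbitrary: it's the ceiling of the SMPTE ST 2084 (PQ) transfer function that both HDR10 and Dolby Vision encode in. A small sketch of the PQ EOTF (the constants come from the ST 2084 spec; the script itself is just an illustration):

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized 0..1 signal value to
# absolute luminance in nits. A full-scale code value decodes to
# exactly 10,000 nits, which is why that figure is the format ceiling.

M1 = 2610 / 16384        # PQ constants as defined in ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Decode a normalized PQ signal (0.0-1.0) to luminance in nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(1.0))  # 10000.0 -> the PQ ceiling
print(pq_eotf(0.0))  # 0.0
```

Since almost no display can reach that ceiling, dynamic metadata exists to tell the TV how to tone-map the part of the range it can't show; a true 10,000-nit full-window display would have nothing left to tone-map.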
I personally think HDR10+ will go the way of the dodo, and there's no chance it will ever replace Dolby Vision, similar to HD DVD back in the day. Samsung is just stubborn because they invested so much in it and don't want to pay license fees to Dolby, I'm guessing. However, as soon as it makes more sense financially to drop it, or some deal with Amazon or Apple expires and they drop HDR10+ too, even Sammy will likely switch to supporting DoVi on their displays.
Yep, and the colorist has much more control in DV than in HDR10+. Not to mention that even without manual trims, DV has 10x better brightness and gamut tone mapping/compression than HDR10+.
Samsung is just stubborn because they invested so much in it and don't want to pay license fees to Dolby I'm guessing
Right, but that's what got me thinking: who's going to pay to license Dolby Vision in the future, when it's old tech, just for backwards compatibility? I can't see people (companies) being willing to do that either. That's the only thing that makes me think future TVs will (maybe) only have HDR10 and/or HDR10+ for backwards compat.