Quote:
Originally Posted by huhn
This is a Maxwell GPU, so it has a 10-bit HEVC decoder.
Even your Intel iGPU can do Main10.
Crikey! So I do have Main10 support... that's interesting and an unexpected turnaround...
Quote:
Originally Posted by videoh
nVidia GPU directly delivers NV12 (via CUVID/NVDec) in either 8-bit or 16-bit. So 10-bit content is padded to 16-bit.
Ahh!! I see!!
Ok, I think I see what's going on then: everything is working fine, and the GPU delivers NV12 with the 10-bit samples padded to 16-bit. The picture is then shown incorrectly because the rest of the chain assumes it's real 16-bit, and bad things happen when it's finally converted to RGB and displayed on the monitor... This makes sense! And the fact that I force 8-bit doesn't solve anything, because the conversion to 8-bit still starts from a 10-bit stream disguised as 16-bit and is just as wrong!
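Just to get the failure mode straight in my head, here's a minimal sketch (assuming P010/P016-style storage, where the 10 significant bits are MSB-aligned in the 16-bit word; the function names are mine, purely for illustration):
Code:
#include <stdint.h>
#include <stdio.h>

/* P010/P016-style padding: the 10 significant bits go in the top of a
 * 16-bit word, and the low 6 bits are zero. */
static uint16_t pad_10_to_16(uint16_t v10) { return (uint16_t)(v10 << 6); }

/* Correct reading: shift the significant bits back down. */
static uint16_t read_as_padded_10(uint16_t v16) { return v16 >> 6; }

/* Wrong reading: assume the 10 significant bits sit in the LOW end of
 * the word, which only picks up padding and bleed-over bits. */
static uint16_t read_as_low_10(uint16_t v16) { return v16 & 0x3FF; }

int main(void) {
    uint16_t grey10 = 512;                       /* mid-grey in 10-bit */
    uint16_t padded = pad_10_to_16(grey10);      /* 32768 */
    printf("correct: %u\n", read_as_padded_10(padded));  /* 512 */
    printf("wrong:   %u\n", read_as_low_10(padded));     /* 0 -> black */
    return 0;
}
Whichever way the mismatch goes, every sample ends up in the wrong part of the range, which would explain the garbage after the RGB conversion.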
Well, now I know what's going on and what's wrong, but I wonder how I can fix it... I guess I'm gonna open a ticket with the MPV guys: MPV is built on the same FFmpeg libraries anyway, so it should just be a matter of getting the right settings to interpret the disguised 10-bit and telling it that it's not real 16-bit.
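If it helps the ticket, FFmpeg does expose the two numbers separately; a rough sketch of where they live (assuming an ordinary libavcodec decode loop with copy-back software frames, which I'm not showing):
Code:
#include <libavcodec/avcodec.h>
#include <libavutil/pixdesc.h>
#include <stdio.h>

/* Compare the container depth reported by the pixel format with the
 * real coded depth of the stream. avctx and frame are assumed to come
 * from a normal decode loop. */
void report_depths(const AVCodecContext *avctx, const AVFrame *frame)
{
    const AVPixFmtDescriptor *desc = av_pix_fmt_desc_get(frame->format);
    /* For NVDec output this would report 16 bits per component... */
    printf("container: %s, %d bits per component\n",
           desc->name, desc->comp[0].depth);
    /* ...while the true Main10 depth is still recorded here. */
    printf("coded depth: %d bits\n", avctx->bits_per_raw_sample);
}
So a stage that only ever looks at the pixel format can't tell a padded 10-bit stream from genuine 16-bit; the second field is where the real depth survives.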
Quote:
Originally Posted by huhn
To verify it, you should try a different API. If you have a Windows install, just try MPC-HC, play the same file, and see if it works. I highly doubt the hardware is the issue here; I only know of one card type that had an HEVC decoder limited to 8-bit, and I can't even remember the name of that joke of an AMD card.
I can't test it on Windows because all my Windows installations on this laptop (Win98SE, WinXP Pro and Win10 Enterprise) are virtual machines; I only use Windows with Avisynth on bare metal at work, with my NVIDIA Quadro P4000. But what you wrote makes a lot of sense, so I guess MPV is just assuming the 16-bit is real 16-bit instead of 10-bit disguised as 16-bit, and wrongly converting it to 8-bit RGB.
Thank you both, huhn and Donald, you're always an inestimable source of knowledge!