Quote:
Originally Posted by IgorC
Why not compare AV1 10-bit vs. VP9 10-bit?
The barriers to using 10-bit look pretty similar in both cases: longer encoding time, a lack of 10-bit sources and processing chains (though that's getting much better), reliance on good dithering in the display system, somewhat slower SW decode, and rarer HW decode support.
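To make the encode-time side of that comparison concrete, here's a minimal sketch that encodes the same clip at 10-bit with both libvpx-vp9 and libaom-av1 through ffmpeg and times each run. The input filename, CRF value, and container choices are placeholders, and it assumes an ffmpeg build with both encoders compiled for high bit depth:

```python
import subprocess
import time

SOURCE = "clip.y4m"  # placeholder: substitute your own test source

# Constant-quality encodes; -pix_fmt yuv420p10le requests 10-bit output.
# VP9 needs profile 2 for bit depths above 8; AV1's Main profile already
# covers 10-bit, so no extra profile flag is needed there.
ENCODES = {
    "vp9-10bit.webm": [
        "-c:v", "libvpx-vp9", "-profile:v", "2",
        "-pix_fmt", "yuv420p10le", "-b:v", "0", "-crf", "30",
    ],
    "av1-10bit.mkv": [
        "-c:v", "libaom-av1",
        "-pix_fmt", "yuv420p10le", "-b:v", "0", "-crf", "30",
    ],
}

for outfile, args in ENCODES.items():
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE] + args + [outfile],
        check=True,
    )
    print(f"{outfile}: {time.perf_counter() - start:.1f}s")
```

Swapping yuv420p10le for yuv420p (and dropping -profile:v 2 on the VP9 side) gives the 8-bit baselines for the same comparison.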
AFAIK, no one is planning an 8-bit-only AV1 decoder (the Main profile requires 10-bit support anyway), so 10-bit could be used by default more often with AV1 once HW decoders become dominant. SW decoders still need more 10-bit optimization to make it competitive at higher resolutions.
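If you want to gauge that SW-decode gap yourself, ffmpeg's -benchmark flag reports decode-only timings. A small sketch, assuming the output filenames from the encode example above:

```python
import subprocess

# Decode each file to a null sink; -benchmark prints utime/rtime,
# so 8-bit vs 10-bit software decode cost can be compared directly.
for infile in ["vp9-10bit.webm", "av1-10bit.mkv"]:
    subprocess.run(
        ["ffmpeg", "-benchmark", "-i", infile, "-f", "null", "-"],
        check=True,
    )
```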
I expect 10-bit to go mainstream in general, since it's required for HDR, and we're near or past the tipping point where the majority of new video consumption devices support at least HDR10.