Quote:
Originally Posted by Blue_MiSfit
I also hope that the dav1d team focuses more on 10 bit soon. I think social media / user generated content is a huge use case for AV1, and most of that content is 8 bit for now.
Do we have any evidence that 8-bit sources encode better in 10-bit than 8-bit in AV1? While that was true for H.264, it was much less so for HEVC, and I don't see why AV1 would have any regressions versus HEVC in that regard.
SW encoding and decoding of >8-bit content is always at least 25% slower than 8-bit, and can be more depending on the bottlenecks. And there's really not much point in doing >8-bit unless the source or display controller can do more than that. Most social media is consumed on phones and computers, where very few end-to-end >8-bit pipelines exist. And at really high ppi, dithering is nigh invisible.
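For what it's worth, the kind of dithering I mean is just ordered dithering at the quantization step. A minimal sketch (the 2×2 Bayer matrix and the function name are my own illustration, not from any particular codec):

```python
# 2x2 Bayer ordered-dither thresholds, normalized to [0, 1)
BAYER2 = [[0.0, 0.5],
          [0.75, 0.25]]

def dither_10_to_8(value10, x, y):
    """Quantize a 10-bit code value to 8 bits with ordered dithering."""
    scaled = value10 / 4.0                  # 10-bit -> 8-bit range
    threshold = BAYER2[y % 2][x % 2]        # per-pixel offset before truncation
    return min(255, int(scaled + threshold))

# A flat 10-bit level that sits between two 8-bit codes becomes a
# spatial pattern whose average preserves the intermediate level:
row = [dither_10_to_8(514, x, 0) for x in range(4)]
print(row)  # -> [128, 129, 128, 129]
```

At phone-display pixel densities that alternating single-code pattern is below the visibility threshold, which is the point above.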
10-bit is much more valuable on living room screens, which are much larger and have native >8-bit support.
I think everything is going to go half-float linear light for internal processing in the next decade, to make tone mapping, particularly of mixed-color-space content, way easier and better.
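The reason half float is enough for this: binary16 has ~11 bits of mantissa precision, so an 8-bit sRGB signal survives a round trip through linear light losslessly. A quick check, using Python's `struct` `'e'` format to emulate IEEE 754 binary16 storage (the helper names are mine):

```python
import struct

def srgb_to_linear(c):
    # IEC 61966-2-1 sRGB EOTF, input/output in [0, 1]
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    # Inverse sRGB transfer function
    return l * 12.92 if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

def to_half(x):
    # Round-trip through IEEE 754 binary16, as a GPU buffer would store it
    return struct.unpack('e', struct.pack('e', x))[0]

# Every 8-bit sRGB code decoded to linear, stored as half float,
# and re-encoded maps back to the same 8-bit code.
ok = all(
    round(linear_to_srgb(to_half(srgb_to_linear(v / 255))) * 255) == v
    for v in range(256)
)
print(ok)  # -> True
```

So a linear-light half-float working space can carry SDR content with no banding penalty, while also having the range to hold HDR and wide-gamut sources in the same buffer, which is what makes mixed-content tone mapping tractable.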