26th September 2018, 16:57   #1032
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by IgorC
Would it make sense to just drop 8-bit depth already? Especially now that AV1 has just been released. All right, decoder optimizations and hardware support will come anyway.
Yes, 8 bits is faster to encode/decode, but banding ruins a major part of the quality gains over quite a large range of bitrates.

VP9 8-bit vs 10-bit is a night-and-day difference.
https://sonnati.files.wordpress.com/2016/06/10bit2.png
https://sonnati.wordpress.com/2016/0...ntion-part-ii/

Even VP9/HEVC at 10-12 bits are already advanced enough to produce block-free video at quite low bitrates.
I wouldn't be surprised if AV1 at 8 bits looked worse than (or merely comparable to) VP9/HEVC at 10-12 bits.
HEVC eliminates most of the 10-bit advantage over 8-bit that H.264 had. If the source doesn’t have banding, you don’t get much new banding even at lower bitrates. I think AV1 should have ballpark similar improvements.
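To make the banding point concrete, here's a rough numpy sketch (illustrative only, not encoder output): a shallow gradient simply doesn't have enough code values at 8-bit to stay smooth, while 10-bit gives it roughly 4x as many steps.

[code]
# Quantize a shallow luma ramp to 8 and 10 bits and count the distinct
# output levels: few levels across a wide gradient = visible banding.
import numpy as np

ramp = np.linspace(0.2, 0.3, 1920)        # a gradient spanning 10% of full range

q8  = np.round(ramp * 255) / 255          # 8-bit quantization
q10 = np.round(ramp * 1023) / 1023        # 10-bit quantization

print("distinct 8-bit levels: ", len(np.unique(q8)))    # ~26 steps -> banding
print("distinct 10-bit levels:", len(np.unique(q10)))   # ~103 steps -> smooth
[/code]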

But yeah, it would be great if the “Main” profile for future codecs always supported at least 10-bit. That’s required for HDR, which is quickly becoming mainstream. It’s not like the SoC or GPU vendors are developing 8-bit-only decoders anymore, even if some display pipelines are 8-bit RGB. But 10-bit limited-range (64-940 luma, 64-960 chroma) 4:2:0 Y’CbCr makes for better 0-255 RGB 4:4:4 anyway.
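If it helps, here's a rough sketch of that last conversion using the standard BT.709 constants (4:2:0 chroma upsampling skipped for brevity): with a 10-bit source the only coarse rounding happens once, at the final 0-255 step.

[code]
# 10-bit limited-range BT.709 Y'CbCr codes -> 8-bit full-range R'G'B'.
import numpy as np

def ycbcr10_to_rgb8(y, cb, cr):
    # Normalize limited-range 10-bit codes: Y' 64-940, Cb/Cr 64-960 centered at 512
    yn  = (y  -  64.0) / 876.0
    cbn = (cb - 512.0) / 896.0
    crn = (cr - 512.0) / 896.0

    # BT.709 Y'CbCr -> R'G'B'
    r = yn + 1.5748 * crn
    g = yn - 0.1873 * cbn - 0.4681 * crn
    b = yn + 1.8556 * cbn

    rgb = np.clip(np.array([r, g, b]), 0.0, 1.0)
    # Single quantization down to 8-bit full range at the very end
    return np.round(rgb * 255.0).astype(np.uint8)

print(ycbcr10_to_rgb8(502, 512, 512))   # mid-gray -> [128 128 128]
[/code]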
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book