30th November 2019, 02:00   #2012
Adonisds
Registered User
 
Join Date: Sep 2018
Posts: 14
Quote:
Originally Posted by Adonisds View Post
No, I'm assuming SDR is done with 8 bits and HDR with 10. Why do you assume people would use 10 bit for SDR? YouTube currently uses 8 bits for SDR AV1.

I'm also assuming that these decoders advertised as 4K60-capable are only capable of that at 8 bits. I would be a bit surprised if that were not the case.

So possibly they are not really ready for HDR content. And since AV1 is a codec for a future in which HDR could be common, that is disappointing. I would have thought that, since Netflix is pushing hard for it, at least 10-bit 4K24 would be supported.
Looks like I was wrong in my second assumption, thankfully. At least the WAVE510A decoder does decode 10-bit 4K60.

Edit: looks like I also misunderstood what Blue_MiSfit said. He just said decoders should support 10 bit, not that SDR videos should use it. Well, that was all pointless. But let my mistakes all be public. I'm just gonna go hide in shame.

Last edited by Adonisds; 30th November 2019 at 02:03.