30th November 2019, 01:56   #2010
Adonisds
Quote:
Originally Posted by Blue_MiSfit
Why would that be the case? The decoder doesn't have to work any harder to decode HDR (assuming you're already doing 10 bit, which you should be).
Quote:
Originally Posted by soresu
Perhaps he means tone mapping for displaying HDR content on SDR screens?
No, I'm assuming SDR is done with 8 bits and HDR with 10. Why do you assume people would use 10 bits for SDR? YouTube currently uses 8 bits for SDR AV1.

I'm also assuming that these decoders advertised as 4K60 capable are only capable of that in 8 bits. I would be a bit surprised if that were not the case.

So possibly they are not really ready for HDR content. And since AV1 is a codec aimed at a future when HDR could be common, that is disappointing. I would think that since Netflix is pushing hard for it, at least 10-bit 4K24 would be supported.
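
For what it's worth, here's a rough back-of-the-envelope sketch of the decoded-output bandwidth involved. It's Python, and the assumptions are mine: 4:2:0 chroma and 10-bit samples stored in 16-bit containers, ignoring the extra arithmetic cost of 10-bit internal precision.

Code:
# Rough output-bandwidth comparison for decoded video.
# Assumptions (mine, not from any spec): 4:2:0 chroma subsampling
# (1.5 samples per pixel) and 10-bit planes stored in 16-bit containers.

def output_bandwidth_mb_s(width, height, fps, bytes_per_sample):
    samples_per_frame = width * height * 1.5   # luma + subsampled chroma
    return samples_per_frame * bytes_per_sample * fps / 1e6

print(f"4K60  8-bit: {output_bandwidth_mb_s(3840, 2160, 60, 1):.0f} MB/s")  # ~746 MB/s
print(f"4K60 10-bit: {output_bandwidth_mb_s(3840, 2160, 60, 2):.0f} MB/s")  # ~1493 MB/s
print(f"4K24 10-bit: {output_bandwidth_mb_s(3840, 2160, 24, 2):.0f} MB/s")  # ~597 MB/s

By that measure 10-bit 4K24 actually moves less raw data than 8-bit 4K60, which is part of why I'd expect it to be within reach of those decoders.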